Our strategy for batch import/upload

Hey everyone,

I have been taking a look at some recent work we have been doing to enable batch import/upload of data into a service and had a question. The example in particular is the cold chain equipment catalog items (CatalogItem) in the CCE service (https://github.com/OpenLMIS/openlmis-cce ). I see we are following the pattern for v2 (and copying that code), but I am wondering if there is a reason why we are not following the pattern established by the RefData Seed Tool? (https://github.com/OpenLMIS/openlmis-refdata-seed ) I know the CCE catalog data that is being uploaded is not reference data, but it seems that we could leverage the code that was created for that tool, and make it generic for any kind of data import. One particular concern I have with recreating the v2 pattern is filling the Java domain/entity objects with import annotations, and tying the CSV upload structure to the database schema.
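To illustrate the coupling concern, here is a minimal sketch of the v2-style pattern. The annotation and field names below are hypothetical stand-ins, not the actual CCE code:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Hypothetical stand-in for the v2-style import annotation.
@Retention(RetentionPolicy.RUNTIME)
@interface ImportField {
    String name();
    boolean mandatory() default false;
}

// In the v2 pattern the entity itself carries the CSV metadata, so the
// upload template is tied to the database schema: renaming this column
// changes (or breaks) the CSV contract at the same time.
class CatalogItem {
    @ImportField(name = "Equipment Code", mandatory = true)
    String code;

    @ImportField(name = "Manufacturer")
    String manufacturer;
}
```

The point is that one class serves two masters: the persistence layer and the upload contract, so neither can evolve independently.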

Shalom,

Chongsun

–

There are 10 kinds of people in this world; those who understand binary, and those who don’t.

Chongsun Ahn | chongsun.ahn@villagereach.org

Software Development Engineer

VillageReach *Starting at the Last Mile*

2900 Eastlake Ave. E, Suite 230, Seattle, WA 98102, USA

DIRECT: 1.206.512.1536 CELL: 1.206.910.0973 FAX: 1.206.860.6972

SKYPE: chongsun.ahn.vr

www.villagereach.org

Connect on Facebook, Twitter and our Blog

Hi Chongsun,

Considering the v2 approach, we do not have to use domain/entity objects to define the upload structure (with ImportField annotations). It may actually be a good idea to use transfer objects that will almost never change, and import into the entity from those objects. This would let us change the database structure without any effect on the upload CSV, and vice versa. I don’t know the pattern established by the RefData Seed Tool. Sebastian, can you tell us what capabilities it has? From an upload framework we need an easy way to adjust header names and to make fields mandatory or optional. Maybe we should think about combining the two approaches and preparing a generic CSV upload library that can be used across services.
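As a rough sketch of the transfer-object idea (class and field names are illustrative, not the actual CCE code):

```java
// Hypothetical CSV-facing transfer object. Its fields mirror the upload
// template, so the CSV contract stays stable even if the entity changes.
class CatalogItemCsvRow {
    String equipmentCode;   // e.g. CSV column "Equipment Code"
    String manufacturer;    // e.g. CSV column "Manufacturer"

    // The only place that knows the entity's current shape; a database
    // column rename is absorbed here, not in the CSV template.
    CatalogItem toEntity() {
        CatalogItem entity = new CatalogItem();
        entity.code = equipmentCode;
        entity.maker = manufacturer;
        return entity;
    }
}

// Hypothetical entity (persistence annotations omitted for brevity).
class CatalogItem {
    String code;
    String maker;
}
```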

Paweł


Hi,

I don't know your requirements for the CCE service CSV upload, so I won't be able to advise you for or against following the pattern from the ref data seed tool.

For the reference data seed tool itself, it aims to make the transformation rules (CSV to JSON representation) adjustable via what we call "mapping files". Those mappings define which columns should be translated to which properties and how the transformation from one to the other should be handled. It can, for example, rewrite a value directly into the JSON representation, but it can also take an instance code from a CSV field, find the object with that code, and insert its representation into the JSON. More about this and all the available mappings is in the tool's README: https://github.com/OpenLMIS/openlmis-refdata-seed#input-files

The mapping files therefore allow you to easily and quickly modify the transformation rules without touching the code or even the input CSV files. This is especially useful when the contract for an endpoint changes (e.g. a field rename). Of course, you would need to modify an input CSV file in the case of new fields, if you wanted to make use of them.
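To make the mapping-file idea concrete, here is a minimal, hypothetical sketch (this is not the seed tool's actual code; names are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: a mapping table drives which CSV column feeds which
// JSON property, so a field rename is a mapping edit, not a code change.
class CsvToJsonMapper {
    // mapping: CSV header -> target JSON property name
    private final Map<String, String> columnToProperty;

    CsvToJsonMapper(Map<String, String> columnToProperty) {
        this.columnToProperty = columnToProperty;
    }

    // csvRow: header -> cell value for one CSV line.
    Map<String, Object> transform(Map<String, String> csvRow) {
        Map<String, Object> json = new LinkedHashMap<>();
        for (Map.Entry<String, String> m : columnToProperty.entrySet()) {
            String value = csvRow.get(m.getKey());
            if (value != null) {
                json.put(m.getValue(), value);
            }
        }
        return json;
    }
}
```

The real tool also supports richer mappings (such as resolving an instance code to a full object representation); this sketch shows only the direct-rewrite case.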

The only things that are hardcoded in the tool are the endpoints for the specific entities and the recognition of a unique instance for the update capability (mostly based on the instance code, except for entities that do not have one). Since the tool is standalone, it uses OpenLMIS endpoints to load the data rather than going straight to the database. If you need the upload capability directly from the OpenLMIS app, some effort will be required to add that.

If you need more details, we can have a call and I should be able to provide any information you need or demo the ref data seeding.

Best regards,

  Sebastian.
--

You received this message because you are subscribed to the Google Groups "OpenLMIS Dev" group.

To unsubscribe from this group and stop receiving emails from it, send an email to openlmis-dev+unsubscribe@googlegroups.com.

To post to this group, send email to openlmis-dev@googlegroups.com.

To view this discussion on the web visit https://groups.google.com/d/msgid/openlmis-dev/5ece102a-5719-4b4e-9b4f-105f1ff42d62%40googlegroups.com.

For more options, visit https://groups.google.com/d/optout.


Sebastian Brudziński

    Software Developer / Team Leader


SolDevelo
Sp. z o.o. [LLC] / www.soldevelo.com
Al. Zwycięstwa 96/98, 81-451, Gdynia, Poland
Phone: +48 58 782 45 40 / Fax: +48 58 782 45 41
sbrudzinski@soldevelo.com

This is a good discussion. Paweł, for the CCE catalog item example specifically, if we can create a separate transfer object that maps to the CatalogItem entity, that would alleviate my primary concern.

As for having a generic library to import data via CSV, it would be nice to have, but I am not sure if there are any product requirements for it at this point. We have the seed tool for reference data and CCE has some code using the v2 pattern. If other services need a way to batch import/upload, then it might make sense to create a shared library (or a new import microservice).

Shalom,

Chongsun



Uploading the Ideal Stock Amounts for each product by program and facility is a ticket being created for Forecasting: OLMIS-396. I really like the idea of using mapping files, and if we do that I think the devs should own creating and managing them… or we should create a new import microservice. My two cents.

Thanks,
Sam


We have an equivalent of these mapping files in our current approach as well. I think we should give what we have now a try; when the time comes (and we need uploads for other microservices), we can create a library based on the v2 approach or a new microservice based on the MW tool. The deciding criterion should be which approach is more generic. A meeting with Sebastian about the Seed Tool could be helpful so we can base the decision on our requirements.

Paweł




Paweł Albecki

Software Developer

palbecki@soldevelo.com