I need to push Cumulocity’s realtime analytics to Microsoft Power BI so that I also have historical analytics at a specific time. Which EPL script would be a good starting point for this task? Likewise, is there an example of Python middleware that fetches the data from Cumulocity and pushes it to Power BI?
Typically, we do Power BI integration using data offloaded via Cumulocity DataHub. The integration is documented here:
In EPL Apps there is a sample, “Call another microservice”, which you could adapt to call an external API. To write a Python script that interacts with the C8Y API, you can use this client:
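The linked Python client handles authentication and paging for you, so it is the easier route. Purely for illustration, here is a stdlib-only sketch of what a raw call against the Cumulocity inventory REST API looks like; the tenant URL, tenant ID, user and managed-object ID in the usage comment are placeholders, not real values.

```python
import base64
import json
import urllib.request


def build_request(base_url: str, tenant: str, user: str, password: str,
                  mo_id: str) -> urllib.request.Request:
    """Build an authenticated GET request for a single managed object."""
    # Cumulocity accepts HTTP Basic auth with "tenant/user" as the user name.
    token = base64.b64encode(f"{tenant}/{user}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{base_url}/inventory/managedObjects/{mo_id}",
        headers={"Authorization": f"Basic {token}",
                 "Accept": "application/json"},
    )

# Usage (placeholder tenant/credentials -- replace with real ones):
#   req = build_request("https://your-tenant.cumulocity.com",
#                       "t12345", "alice", "secret", "818346")
#   with urllib.request.urlopen(req) as resp:
#       mo = json.load(resp)
```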
Thanks. Would anyone help us build the EPL which exports a managed object and pushes it to a SQL Server 2022 database?
Or Python code which retrieves a managed object from the Cumulocity API and pushes it to a SQL Server 2022 database?
Reading a managed object (MO) and putting it into SQL Server is a simple task, but I have a hunch you are actually talking about ETL, and that is something you should leave to DataHub. So - what are we talking about? All MOs of a certain kind (e.g. devices)? What frequency? Any numbers?
Would you please elaborate on the ETL idea? So far we need to push a group of MOs (around 6 MOs per device) to the DB on a daily basis. The Python code needs to retrieve those managed objects from the Cumulocity API and then push them to a SQL Server 2022 database - is there any Python code example for that? Before pushing to the DB, the code also needs to associate this group of MOs with the device attributes (device name and other custom fragments) and with a timestamp, since the daily MOs hold the end-of-day result and their values are reset on Cumulocity before the start of the next day.
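The association step described above (joining the daily MO values with the device's name, custom fragments and a report date) could be sketched as a pure function like the one below; the fragment names `my_Site` and `c8y_DailyCounter` are made up for illustration - use whatever fragments your objects actually carry.

```python
import datetime


def build_rows(device: dict, mos: list, report_date: datetime.date) -> list:
    """Flatten one device's daily managed objects into DB-ready rows.

    `device` and each entry of `mos` are plain dicts as returned by the
    Cumulocity REST API. The fragment names ('my_Site', 'c8y_DailyCounter')
    are hypothetical examples.
    """
    rows = []
    for mo in mos:
        rows.append({
            "device_id": device["id"],
            "device_name": device["name"],
            # any custom device fragment you want alongside the value
            "site": device.get("my_Site"),
            "mo_id": mo["id"],
            "mo_type": mo.get("type"),
            "value": mo.get("c8y_DailyCounter", {}).get("value"),
            # the day the counters refer to (the day before the reset)
            "report_date": report_date.isoformat(),
        })
    return rows


# Example with dummy data:
device = {"id": "1", "name": "pump-1", "my_Site": "A"}
mos = [{"id": "10", "type": "c8y_Daily", "c8y_DailyCounter": {"value": 7}}]
rows = build_rows(device, mos, datetime.date(2024, 1, 31))
```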
Reading a bunch of managed objects is fairly straightforward using the Cumulocity Python API, e.g. in a single-tenant microservice:
from c8y_api.app import SimpleCumulocityApp

c8y = SimpleCumulocityApp()

# read managed objects (fill in the IDs you want to export)
mo_ids = []
mos = c8y.inventory.get_all(ids=mo_ids)

# prepare SQL (note: string values must be quoted; better yet, use a
# parameterized query with your DB driver to avoid injection issues)
table = 'my_table'
values_sql = ', '.join(
    f"({mo.id}, '{mo.creation_datetime}', {mo.my.custom.value})" for mo in mos)
sql = f'INSERT INTO {table} (id, time, my_stuff) VALUES {values_sql}'
The interesting part about the above is the c8y.inventory.get_all function, which basically provides all the select functionality that the underlying REST API supports. I don’t know how you’d get at the object IDs, which is why I simply assumed that you know them. See also API Documentation — Cumulocity Python API documentation for more details on this particular method.
Obviously there is a lot more to this - proper error/edge-case handling, retention, etc., let alone the actual database connectivity (no idea about that). Which is why I’d seriously consider using DataHub for whatever you are trying to achieve.
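On the database-connectivity side, a safer alternative to the string-built SQL in the snippet above is a parameterized INSERT. A minimal sketch, assuming pyodbc as the driver (the connection string, driver name, server/database and table name are all placeholders to adapt to your SQL Server 2022 setup):

```python
def build_insert(table: str, rows: list) -> tuple:
    """Build one parameterized INSERT statement plus its parameter tuples.

    All rows are assumed to be dicts with the same keys; values are bound
    as parameters, so no manual quoting or escaping is needed.
    """
    cols = list(rows[0])
    placeholders = ", ".join("?" for _ in cols)
    sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
    params = [tuple(r[c] for c in cols) for r in rows]
    return sql, params

# Pushing the rows (pyodbc and the connection string are assumptions;
# adjust driver, server and database to your environment):
#   import pyodbc
#   conn = pyodbc.connect(
#       "DRIVER={ODBC Driver 18 for SQL Server};"
#       "SERVER=myserver;DATABASE=iot;UID=etl;PWD=...;"
#       "TrustServerCertificate=yes")
#   sql, params = build_insert("dbo.daily_counters", rows)
#   with conn:  # commits on success
#       conn.cursor().executemany(sql, params)
```

Binding values as parameters sidesteps both the quoting problem for timestamps/strings and SQL injection, and `executemany` keeps the daily batch to a single round-trip pattern.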