Data engineer use case for a mid-sized renewable energies company
Challenge, Context, Problems to be solved
Based in the UK, the company offers commercial energy assessments across the south of the country, in the form of thermal modelling, energy performance certificates, SAP calculations, display energy certificates and more.
The need to automate many processes was clear from the outset. Of the many areas where engineering around data was necessary, we focused on delivering one particular target: enhancing their thermal modelling analysis and overheating assessments.
Mission, tools and methodology
Candidates for digitalization spanned both business operations and analytics. On the operations side, part of the business begged for a change: until recently, clients had been sending project documentation via e-mail. We chose the Google suite, namely Google Sheets and Google Apps Script, to automate the entire project documentation upload process.
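The intake step above hinges on uploaded documents carrying enough metadata to be filed automatically. As a minimal sketch of that idea (independent of the Apps Script code itself), the snippet below validates a filename against an agreed naming convention and extracts project metadata; the convention, the `register_upload` helper and the `UK1042`-style project IDs are illustrative assumptions, not the company's actual scheme.

```python
import re
from dataclasses import dataclass


@dataclass
class UploadRecord:
    project_id: str
    doc_type: str
    filename: str


# Hypothetical naming convention: "<project-id>_<doc-type>_<description>.pdf",
# e.g. "UK1042_sap_floorplans.pdf".
FILENAME_PATTERN = re.compile(
    r"^(?P<project>[A-Z]{2}\d{4})_(?P<doc_type>[a-z]+)_.+\.pdf$"
)


def register_upload(filename: str) -> UploadRecord:
    """Validate an uploaded filename and extract its project metadata."""
    match = FILENAME_PATTERN.match(filename)
    if match is None:
        raise ValueError(f"Filename does not follow the agreed convention: {filename}")
    return UploadRecord(
        project_id=match.group("project"),
        doc_type=match.group("doc_type"),
        filename=filename,
    )
```

An Apps Script trigger can apply the same check on each new file and reject (or flag) anything that cannot be filed against a project.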
Secondly, on the analytics front, the company relies heavily on IESVE, an in-depth suite of integrated analysis tools which, at the time the data engineering integration started, lacked a considerable number of features.
After clients upload the corresponding project documents, the company's data architects design and run 3D dynamic simulation models, generating results and project data that are then available via an API. However, the enterprise still lacked important granularity and requested more dimensions and new features to be engineered. Tackling each new feature separately allowed us to become familiar with the depth and breadth of IESVE's API, to account thoroughly for the caveats, particularities and possible shortcomings of the application, and to allocate our resources accordingly.
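Much of the feature work amounted to reshaping nested simulation output into flat records that downstream logic could consume. The sketch below shows that flattening step on an illustrative payload; the dictionary shape, field names and `flatten_results` function are assumptions for the example, not the actual IESVE API schema.

```python
from typing import Any


def flatten_results(project: dict[str, Any]) -> list[dict[str, Any]]:
    """Flatten nested simulation output into one row per room per variable,
    adding simple derived metrics (peak and mean) along the way."""
    rows: list[dict[str, Any]] = []
    for room in project.get("rooms", []):
        for variable, values in room.get("results", {}).items():
            rows.append({
                "project_id": project["id"],
                "room": room["name"],
                "variable": variable,
                "peak": max(values),
                "mean": sum(values) / len(values),
            })
    return rows
```

Derived metrics such as the peak value are exactly the kind of extra granularity the original tooling did not expose directly.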
Upon its completion, we sought to expand their analysis further: we laid out new attributes, to be turned into new features. The calculation of each feature, and its underlying logic, was then transferred to a set of ETL pipelines, collecting all possible combinations of transformation logic into a bank in which each pipeline is triggered according to a set of rules curated hand in hand with the architects.
Fetching data from the IESVE API, and retrieving the computed data points and configuration parameters for each type of thermal modelling, then ensured that the right ETL logic was triggered.
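The pipeline bank described above can be sketched as a registry that pairs each transformation with a trigger rule over the project's configuration. Everything here is illustrative: the `pipeline` decorator, the rule predicates and the pipeline bodies (including the mention of a TM59-style overheating criterion) stand in for the rules actually curated with the architects.

```python
from typing import Callable

PipelineFn = Callable[[dict], dict]

# The "bank": each entry pairs a trigger rule with a pipeline function.
_BANK: list[tuple[Callable[[dict], bool], PipelineFn]] = []


def pipeline(rule: Callable[[dict], bool]):
    """Register a transformation pipeline together with its trigger rule."""
    def register(fn: PipelineFn) -> PipelineFn:
        _BANK.append((rule, fn))
        return fn
    return register


@pipeline(lambda cfg: cfg.get("assessment") == "overheating")
def overheating_pipeline(cfg: dict) -> dict:
    # Illustrative body: a real pipeline would compute overheating metrics.
    return {"pipeline": "overheating", "criteria": "TM59-style"}


@pipeline(lambda cfg: cfg.get("assessment") == "thermal_model")
def thermal_pipeline(cfg: dict) -> dict:
    return {"pipeline": "thermal_model"}


def dispatch(cfg: dict) -> dict:
    """Run the first pipeline whose rule matches the project configuration."""
    for rule, fn in _BANK:
        if rule(cfg):
            return fn(cfg)
    raise LookupError("No pipeline matches this configuration")
```

New assessment types then only require registering one more rule/pipeline pair, which is what keeps the bank maintainable as features grow.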
Finally, downstream of the ETL bank, a reporting facility is in charge of formatting the data to suit specific design aesthetics, in accordance with the various layouts proposed by the company.
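At its simplest, such a reporting facility maps ETL output rows onto a named layout. The sketch below renders rows as a plain-text table; the layout names and column choices are placeholders for the company's actual report designs.

```python
def render_report(rows: list[dict], layout: str = "summary") -> str:
    """Render ETL output rows using one of the agreed report layouts.
    Layout names and column selections here are illustrative."""
    layouts = {
        "summary": ["room", "variable", "peak"],
        "detailed": ["room", "variable", "peak", "mean"],
    }
    columns = layouts[layout]
    header = " | ".join(columns)
    lines = [header, "-" * len(header)]
    for row in rows:
        lines.append(" | ".join(str(row[c]) for c in columns))
    return "\n".join(lines)
```

In production the same row-to-layout mapping would feed a templating or document-generation step rather than plain text, but the separation of data from layout is the point.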
Over the month and a half this project lasted, the power that data engineering and automation bring to clients became crystal clear: easing work and cutting times across business operations and analytics. When accurately assessed, a surplus of new and exciting opportunities arises, not only for full-fledged growth, but for a more nuanced view and understanding of what you can do with data.
Must-have data engineer skills
Relevant profiles must master the following capabilities:
- Build, parameterize and maintain data structures and databases.
- Design data processing systems and organizations.
- Analyze data and enable machine learning.
- Design for reliability.
- Visualize data and advocate policy.
- Model business processes for analysis.
- Design for security and compliance.
Other specific must-have data engineer skills:
- Debugging and problem solving.
- ETL tools and software.
- Cloud data engineering skills: AWS, GCP, Azure, IBM (at least two of these).
- Other skills: Hive, Apache Spark, MongoDB, MySQL, PostgreSQL.