About
Data Engineer
Responsibilities
- Develop ETL processes that convert data into formats usable by the data analyst team and by dashboards and charts.
- Be responsible for the performance, speed, scalability, and extensibility of any application that uses the pipeline.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision making across the organization.
Qualifications
- Bachelor’s Degree in Computer Science, Software Engineering, Information Technology, or equivalent industry experience.
- Experience in programming languages such as Python, SQL, or Scala.
- Understanding of the concepts of Data Lake, Data Warehouse, and Data Mart.
- Experience with Big Data technologies and their ecosystems, such as Hadoop, Spark, and Airflow.
- Experience building and maintaining reliable, scalable ETL on big data platforms, as well as experience working with varied forms of data: structured, semi-structured, and unstructured.
- Familiarity with cloud computing services such as AWS, Azure, or GCP.
- Understanding of data pipeline stages such as collecting, transforming, and publishing data.
- Understanding of the tools and platforms that allow processing of data from multiple heterogeneous sources at different frequencies (batch/real-time).
- Innovative problem-solving skills with the ability to identify and resolve complex architectural issues.
- Ability to translate and clearly communicate technical issues to project stakeholders.
- Good command of English.
- A good team player.