• Develop and maintain robust data architectures that support business needs and ensure reliable access to data.
• Collaborate with cross-functional teams to define data requirements and deliver scalable data solutions.
• Implement ETL pipelines to extract, transform, and load data, ensuring high data quality and integrity.
• Optimize data storage and access strategies for improved performance and efficiency.
• Monitor and troubleshoot data pipeline performance issues and implement fixes as needed.
• Create comprehensive documentation for data workflows and system architecture.
Requirements
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 3+ years of experience in data engineering or related roles.
• Proficiency in programming languages such as Python, Java, or Scala.
• Solid experience with SQL databases and NoSQL technologies, such as Cassandra or MongoDB.
• Familiarity with data warehousing solutions and big data technologies (e.g., Hadoop, Spark).
• Strong analytical skills and attention to detail.