Data Engineer (Analyst)

Linkedin
4.1 / 5
PAN India · Not disclosed
Yesterday · Remote
About the job

Job Title: Data Engineer (Analyst)
Experience: 2.5 to 5 Years
Location: PAN India (Remote/On-site as applicable)

About the Role:
We are looking for a Data Engineer (Analyst) to build and maintain reliable data pipelines and analytics-ready datasets that power BI reporting, product insights, and business decision-making. You'll work across multiple data sources, model clean reporting layers, and ensure data quality end to end.

Key Responsibilities:
  • Build and maintain scalable ETL/ELT pipelines (batch and incremental) using SQL and Python
  • Integrate data from databases, APIs, SaaS tools, event data, and flat files
  • Design analytics-ready data models (star schemas/marts) for self-serve reporting
  • Create and optimize transformations in a cloud warehouse/lakehouse (e.g., Snowflake, BigQuery, Redshift, Synapse, Databricks)
  • Partner with stakeholders to define KPIs, metric logic, and reporting requirements
  • Maintain dashboards and reporting outputs in tools like Power BI, Tableau, Looker, or Sigma
  • Implement data quality checks, monitoring, alerts, and documentation to keep datasets trusted
  • Tune performance and cost (incremental loads, partitioning, query optimization, file formats)

Required Skills:
  • Strong SQL skills (CTEs, window functions, joins, aggregations, optimization)
  • Strong Python skills for transformations and automation
  • Hands-on experience with at least one cloud platform: AWS / Azure / GCP
  • Experience with a modern data warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks)
  • Solid understanding of ETL/ELT patterns (incremental loads, retries, idempotency, basic CDC)
  • Comfort with data modeling for analytics and BI reporting
  • Experience building stakeholder-friendly reporting in a BI tool (Power BI/Tableau/Looker/Sigma)

Nice to Have:
  • Orchestration tools: Airflow, dbt, Dagster, Prefect, ADF, Glue, etc.
  • Streaming/event data: Kafka, Kinesis, Pub/Sub
  • Monitoring/logging: CloudWatch, Azure Monitor, GCP Monitoring, Datadog
  • CI/CD + Git-based workflows for data pipelines

Requirements

  • SQL
  • Python
  • ETL
  • Data modeling
  • Data visualization

About the company

LinkedIn leverages technology to connect professionals, providing tools for insights, analytics, and business decision-making.
