General Information

Req #
WD00050205
Career area:
Hardware Engineering
Country/Region:
Taiwan
State:
Taipei City
City:
Taipei
Date:
Thursday, March 2, 2023
Working time:
Full-time
Additional Locations
* Taiwan - Taipei City - Taipei

Why Work at Lenovo

Here at Lenovo, we believe in smarter technology that builds a brighter, more sustainable and inclusive future for our customers, colleagues, communities, and the planet.

And we go big. No, not big—huge.

We’re not just a US$70 billion revenue Fortune Global 500 company, we’re one of Fortune’s Most Admired. We’re transforming the world through intelligent transformation, offering the world’s most complete portfolio of smart devices, infrastructure, and solutions. With more than 71,500 employees doing business in 180 markets, we help millions—not just the select few—experience our version of a smarter future.

The one thing that’s missing? Well… you.

Description and Requirements

What You'll Do

The successful candidate will play a key role in designing and developing a strategic data lake that will house all development data for the organization, helping transform the business into a data-driven one. You will work with a talented team of data engineers, data scientists, and subject matter experts (SMEs) to harness the power of data to spark innovative engineering design ideas, improve development process efficiency, and reduce product cost. You will drive the data lake architecture and data pipeline design to ensure they provide clean and reliable data for dashboards and analytic models. Additionally, this role provides technical leadership on the data lake architecture to ensure data accuracy through dashboards, data pipeline monitoring, and data quality checking tools.

Responsibilities Include:

  • Collect, define, and document engineering data requirements
  • Design and develop data pipelines to integrate engineering data into the data lake
  • Design the analytics database schema
  • Automate and monitor ETL/ELT jobs for the analytics database
  • Design data models to integrate with existing business data
  • Work with the existing team to integrate, adapt, or identify new tools to efficiently collect, clean, prepare, and store data for analysis
  • Design and implement data quality checks to ensure high data quality for dashboards and ML/AI models
  • Provide technical and backend configuration support for engineering applications
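By way of illustration only (this sketch is not part of the role's requirements), the pipeline and data-quality work described above might, in miniature, look like the following Python sketch: extract raw engineering records, reject rows that fail a quality check, and load the clean rows into an analytics table. All table, column, and threshold choices here are hypothetical, and a production pipeline would run as orchestrated jobs (e.g., in Airflow) rather than a single script.

```python
# Illustrative only: a minimal extract / quality-check / load step of the
# kind this role would design at much larger scale. Names are hypothetical.
import csv
import io
import sqlite3

RAW_CSV = """board_id,test_station,temp_c
B001,ST-1,41.5
B002,ST-1,
B003,ST-2,38.2
"""

def extract(raw: str) -> list[dict]:
    """Parse raw engineering data (here, an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(raw)))

def quality_check(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split rows into (clean, rejected): temp_c must be present and plausible."""
    clean, rejected = [], []
    for row in rows:
        try:
            temp = float(row["temp_c"])
            if 0.0 <= temp <= 125.0:  # hypothetical plausibility range
                clean.append({**row, "temp_c": temp})
                continue
        except ValueError:
            pass  # missing or non-numeric reading
        rejected.append(row)
    return clean, rejected

def load(rows: list[dict], conn: sqlite3.Connection) -> int:
    """Load clean rows into a (hypothetical) analytics table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS board_tests "
        "(board_id TEXT, test_station TEXT, temp_c REAL)"
    )
    conn.executemany(
        "INSERT INTO board_tests VALUES (:board_id, :test_station, :temp_c)",
        rows,
    )
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    clean, rejected = quality_check(extract(RAW_CSV))
    loaded = load(clean, conn)
    print(f"loaded={loaded} rejected={len(rejected)}")  # loaded=2 rejected=1
```

The separation into extract, quality-check, and load steps mirrors the responsibility split above: each step can be monitored independently, and rejected rows can be routed to a quarantine table for inspection instead of silently polluting dashboards.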

Basic Qualifications: 

  • Bachelor’s degree in Computer Science, Mathematics, Engineering, or a related field
  • Experience writing shell scripts, scheduling cron jobs, and working in a Linux environment
  • Experience using Airflow or similar data pipeline tools
  • Experience using GitLab, GitHub, or similar version control tools
  • 4+ years’ experience with an object-oriented programming language such as Python or Java
  • 4+ years’ experience building processes supporting data pipelines, data cleansing/transformation, and data quality monitoring
  • 4+ years’ experience in database schema design, data pipeline design, and database management
  • 4+ years’ experience optimizing data pipelines, architectures, and data sets
  • Fluency in structured and unstructured data and their management through modern data transformation methodologies
  • Experience with engineering data management tools such as Cadence, Pulse, Windchill, ELOIS, Creo, or similar
  • Strong analytical, problem-solving, and verbal and written communication skills
  • Not afraid of conflict, and able to build consensus through direct interaction and compromise
  • Ability to work effectively cross-culturally and across multiple time zones
  • Ability to work with cross-functional teams and stakeholders

Preferred Qualifications: 

  • 4+ years’ experience designing and managing data in modern ETL architectures such as Airflow, Spark, Kafka, Hadoop, or Snowflake
  • Experience working with engineering data or similar data 
  • Experience creating APIs for MS SQL databases
  • Experience with Microsoft Power Platform
  • Experience designing and developing dashboards
  • A successful history of manipulating, processing, and extracting value from large datasets
  • Brings established relationships across Lenovo ISG to the role
  • English proficiency is preferred, and Mandarin capability is an advantage

Additional Locations
* Taiwan - Taipei City - Taipei
* Taiwan
* Taiwan - Taipei City