Machine Learning, Data and Analytics Engineer, Semiconductor
Google · Taipei, Taiwan
A problem isn’t truly solved until it’s solved for all. That’s why Googlers build products that help create opportunities for everyone, whether down the street or across the globe.
As a Machine Learning, Data and Analytics Engineer, you will be the architect of manufacturing intelligence. You will design, build, and maintain the data infrastructure that transforms fragmented information from global partners into a cohesive, high-performance data ecosystem. Your work will directly enable the operations team to monitor production health, optimize yields, and make data-driven decisions in real time.
The AI and Infrastructure team is redefining what’s possible. We empower Google customers with breakthrough capabilities and insights by delivering AI and Infrastructure at unparalleled scale, efficiency, reliability, and velocity. Our customers include Googlers, Google Cloud customers, and billions of Google users worldwide. We're the driving force behind Google's groundbreaking innovations, empowering the development of our cutting-edge AI models, delivering unparalleled computing power to global services, and providing the essential platforms that enable developers to build the future. From software to hardware, our teams are shaping the future of world-leading hyperscale computing, with key teams working on the development of our TPUs, Vertex AI for Google Cloud, Google Global Networking, Data Center operations, systems research, and much more.
Responsibilities
- Design and deploy pipelines to manage high-volume manufacturing data, including wafer maps, test results, and quality reports.
- Build automated tools to clean and normalize disparate data formats from foundry and assembly partners, ensuring a single source of truth.
- Create and maintain intuitive visualizations and dashboards to monitor key performance indicators (KPIs) and production health metrics.
- Develop and optimize data schemas that support high-speed ingestion and investigative querying for real-time decision-making.
- Partner with Operations and Engineering teams to translate business requirements into technical solutions while ensuring platform reliability and performance.
Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 5 years of experience using Python (Pandas, NumPy) or Java to develop data processing tools or automation scripts.
- Experience managing data workflows using tools such as Airflow, dbt, or Prefect.
- Experience building and querying data in BigQuery, Snowflake, or Redshift environments.
- Experience developing operational dashboards using Looker, Tableau, or Power BI.
Preferred qualifications:
- Experience working with semiconductor manufacturing data or large-scale industrial datasets.
- Ability to manage complex data exchanges and integration workflows with external foundry or assembly partners.
- Ability to identify manufacturing anomalies and to explain architectures to non-technical stakeholders.