Analytics & BI Spclst 3 or Sr at MidAmerican Energy Company – Portland, Oregon

MidAmerican Energy Company
Portland, Oregon, 97201, United States
Industries: Energy/Utilities/Gas/Oil/Electric


About This Position

Position Title: Analytics & BI Spclst 3 or Sr

Location: Portland, OR, United States

Description:
This is a multi-level posting. Candidates may be considered for any of the posted levels, depending on their level of experience and depth of expertise.

You will design, build, and maintain scalable data pipelines and infrastructure to support analytics, reporting, and data science initiatives. You will work closely with cross-functional teams to ensure data is accessible, reliable, and secure across the organization.

Responsibilities:
  • Design and implement scalable data ingestion and transformation frameworks using one or more of the following:
    • Azure services enabling structured, semi-structured, and unstructured data to be efficiently processed and integrated into enterprise data platforms
    • Informatica PowerCenter & Informatica Cloud
    • Oracle Data Integrator
  • Build and maintain robust ETL/ELT pipelines.
  • Integrate data from diverse sources including on-premises systems, cloud storage, APIs, and streaming platforms.

Informatica Development and Optimization
  • Design, develop, test, and maintain ETL pipelines using Informatica PowerCenter, including performance tuning, error handling, and integration with Control-M scheduling.
  • Participate in the migration from PowerCenter to Informatica Cloud (IICS) by redesigning mappings, optimizing transformations, and supporting secure agent configurations.

Oracle Data Integrator
  • Design, develop, test, and maintain ETL pipelines using Oracle Data Integrator, including performance tuning, error handling, and integration with Control-M scheduling.
  • Experience with the Fusion AI Data Platform (Fusion Data Intelligence, Fusion Analytics Warehouse) is a plus.

Databricks Development and Optimization
  • Develop and optimize notebooks and workflows in Azure Databricks using PySpark and SQL.
  • Implement Delta Lake for efficient data storage, versioning, and ACID transactions.
  • Leverage Databricks features such as Unity Catalog and job orchestration.

Data Modeling and Architecture
  • Design and implement data models (star/snowflake schemas) for analytics and reporting.
  • Collaborate with architects to define data lakehouse architecture and best practices.
  • Implement and optimize data solutions using the Medallion Architecture (Bronze, Silver, and Gold layers) for scalable, structured data processing.

Data Quality and Governance
  • Implement data validation, profiling, and cleansing routines.
  • Ensure compliance with data governance policies, including data lineage and metadata management.

Performance Tuning and Monitoring
  • Monitor and optimize the performance of various data processes.
  • Troubleshoot and resolve issues related to data latency, job failures, and resource utilization.

Collaboration and Stakeholder Engagement
  • Work closely with data scientists, analysts, and business units to understand data requirements.
  • Translate business needs into technical solutions that are scalable and maintainable.

Security and Compliance
  • Implement role-based access control (RBAC), encryption, and secure data handling practices.
  • Ensure compliance with industry regulations (e.g., NERC CIP, GDPR, HIPAA if applicable).

Documentation and Best Practices
  • Maintain clear documentation of data flows, architecture, and operational procedures.
  • Promote best practices in code versioning, testing, and CI/CD for data engineering.


Qualifications:
Bachelor's degree in information systems, computer science or related technical field; or equivalent work experience. (Typically four years of related, progressive work experience would be needed for candidates applying for this position who do not possess a bachelor's degree.)

Six or more years of experience, with advanced knowledge of data architecture, cloud platforms (especially Azure), and enterprise data solutions, is required for the senior level.

Proficiency in data engineering tools and platforms, especially Azure Data Factory, Azure Databricks, Informatica PowerCenter and IICS, and Oracle Data Integrator.

Proficiency in Oracle Database, IBM Db2, and Azure.

Strong understanding of data modeling, ETL/ELT processes, and performance tuning of enterprise-level applications.

Expert-level knowledge of data-related technologies from architecture to administration, including design, development, optimization, and licensing.

Proven experience working in the utility industry is required.

Effective oral and written communication skills, with the ability to collaborate across teams and mentor junior engineers.

Strong analytical and problem-solving abilities.

Ability to prioritize and manage multiple tasks and projects concurrently.

JOB INFO

Job Identification: 10004834

Job Category: Information Technology

Posting Date: 2026-04-23

Job Schedule: Full time

Job Shift: Day

Locations: 1615 Locust St, Des Moines, IA, 50309, US

Business: MidAmerican Energy Company


