Every year, the Los Angeles Rams bring together millions of fans across the country and around the world as we play for the chance to be named the best team in the NFL. But the Rams are about much more than just football. As an organization, we strive to make Los Angeles better by connecting people through sport, both on and off the field. We are about the legacy of fandom that brings generations of families together year after year; the excitement of sport and live events, an experience that cannot be replicated; our community and making a meaningful impact in the world around us; and ensuring we embody and represent the diversity and uniqueness of our city, Los Angeles. Following the opening of SoFi Stadium and, more recently, winning Super Bowl LVI, the Los Angeles Rams are at an exciting stage of growth and are looking for passionate individuals who want to be a part of something bigger.

Job Responsibilities:

  • Working with Executive, Product, Data, and Design teams to support their data infrastructure needs and assisting with data-related technical issues

  • Configuring servers or databases in development & production environments

  • Assembling large, complex data sets that meet functional and non-functional business requirements

  • Identifying, designing, and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes

  • Building the infrastructure required for optimal extraction, transformation, and loading of data from various sources using Azure, AWS, GCP, and SQL technologies

  • Building analytical tools that utilize the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition

  • Reviewing average backup durations and investigating any significant changes

  • Optimizing queries for speed and resource usage

  • Extracting data from different sources

  • Monitoring and maintaining systems

  • Identifying problems, creating tickets, and working toward resolution

  • Modeling data and translating logical designs into physical storage structures

Required skills:

  • Bachelor’s degree in Computer Science

  • 3-5 years of related experience

  • Ability to create cutting-edge data structures, stored procedures, and outputs in an ever-changing data environment

  • Ability to build, develop, and maintain Azure Data Lake and Azure SQL, GCP, and AWS environments

  • Ability to build data automation

  • Strong MS SQL skills

  • Proficiency in a scripting language such as Python, PowerShell, Perl, or Bash

  • Knowledge of database integrity and auditing practices

  • Ability to comply with cybersecurity guidelines

  • Ability to maintain HIPAA compliance in data environments

Familiarity with:

  • API ingestion and creation

  • SQL replication and data distribution across data hubs / instances / servers

  • ETL (Extract, Transform, Load) best practices, and comfort operating in Data Factory

  • Cloud data infrastructure, with an understanding of AWS, GCP, and Azure

Salary Range: $90,000/yr. - $110,000/yr.