About this role
When BlackRock was started in 1988, its founders envisioned a company that combined the best of financial services with cutting-edge technology. They imagined a business that would provide financial services to clients as well as technology services to other financial firms. The result of their vision is Aladdin, our industry-leading, end-to-end investment management platform. With over USD 7 trillion in assets managed on Aladdin, our technology empowers millions of investors to save for retirement, pay for college, buy a home and improve their financial wellbeing.
Data is at the heart of Aladdin and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.
Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next generation technologies and solutions. Our engineers design and build large scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high quality analytical and derived data to our consumers.
We are looking for data engineers who like to innovate and seek complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates.
Data Pipeline Engineers are expected to be involved from the inception of projects: understanding requirements, then architecting, developing, deploying, and maintaining data pipelines (ETL / ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!), partnering with program and product managers to expand the product offering based on business demand.
Design is an iterative process, whether for UX, data services or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability.
Deployment and maintenance require close interaction with various teams. This means maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills that reduce turnaround times are highly valued.
Preparing user documentation to maintain both development and operations continuity is integral to the role.
The ideal candidate would have
BA/BS in Computer Science or equivalent practical experience
5+ years' experience as a software engineer
Experience with either Python and Perl, or Java (Spring Framework) and C++
Experience with SQL, Sybase, and Linux is a must
Experience with database modeling and normalization techniques is desirable
Experience delivering server-side (back-end) applications built with clean coding techniques
Experience with object-oriented design patterns
Experience with DevOps tools such as Git, Maven, Jenkins, GitLab CI, and Azure DevOps
Experience with Agile development concepts and related tools
Ability to troubleshoot and fix performance issues across the codebase and database queries
Excellent written and verbal communication skills
Ability to operate in a fast-paced environment
Strong interpersonal skills with a can-do attitude under challenging circumstances
Skills that would be a plus
Exposure to workflow management tools such as Airflow
Exposure to messaging platforms such as Kafka
Exposure to NoSQL platforms such as Cassandra, MongoDB
Experience building and delivering REST APIs
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
BlackRock's purpose is to help more and more people experience financial well-being. As a fiduciary to investors and a leading provider of financial technology, we help millions of people build savings that serve them throughout their lives by making investing easier and more affordable.
BlackRock is proud to be an Equal Opportunity and Affirmative Action Employer. We evaluate qualified applicants without regard to race, color, national origin, religion, sex, sexual orientation, gender identity, disability, protected veteran status, and other statuses protected by law.BlackRock will consider for employment qualified applicants with arrest or conviction records in a manner consistent with the requirements of the law, including any applicable fair chance law.