At Palo Alto Networks® everything starts and ends with our mission: protecting our way of life in the digital age by preventing successful cyberattacks. It’s not a small goal. It isn’t simple either, but we aren’t in this for the easy answer. As a company with a foundation in challenging the way things are done, we’re looking for innovators with a dedication to being the best. In return, your career will have a tangible impact – one that shapes technology affecting every level of society.
Our mission doesn’t happen by treading softly – no, it happens by defining an industry. It means building products that haven't been thought of. It means selling products with a solutions mindset. It means supporting the infrastructure of a company that moves at an incredible speed – intentionally – to stay ahead of the world’s next cyberthreat.
Our daily fight with cyber bad guys requires us to collect and analyze a lot of data… a LOT of data. And as our customer base continues its rapid growth, we need faster and more robust tools to help us and our customers make the best decisions possible.
With your knowledge of Hadoop and Big Data technologies, you will add your tools-building superpowers to a small team tasked with building out a DevOps automation environment, one that will step up our Business Analytics game and help us protect our customers from cyber intruders.
We offer the chance to be part of an important mission: ending breaches and protecting our digital way of life. If you are a motivated, intelligent, creative, and hardworking individual, then this job is for you!
You will be responsible for leading technical projects to build custom applications and enhance business systems. We are looking for someone with solid project management and organization skills complemented by a strong technical background. You will work with a team of senior level managers and highly technical engineers to lead complex IT projects. The role will collaborate with multiple engineering and business organizations, understand and align with their needs, and take independent end-to-end responsibility for release of new business capabilities to production.
- As a Big Data Engineer, you will be an integral member of our Big Data and Analytics team responsible for design and development
- Partner with data analysts, product owners, and data scientists to better understand requirements, identify bottlenecks, and find resolutions
- Design and develop Big Data solutions both in the cloud and on-premises
- Design and develop different architectural models for our scalable data processing as well as scalable data storage
- Build data pipelines and ETL across heterogeneous sources using Dataflow or Dataproc
- Build data ingestion from various source systems into Hadoop or GCP using Kafka, Flume, Sqoop, Spark Streaming, etc.
- Transform data using data mapping and data processing in Apache Beam or Spark
- Ensure the platform goes through Continuous Integration (CI) and Continuous Deployment (CD) with DevOps automation
- Expand and grow data platform capabilities to tackle new data problems and challenges
- Support Big Data and batch/real-time analytical solutions using groundbreaking technologies like Apache Beam
- Research and assess open source technologies and components to recommend and integrate into the design and implementation
- Work with development and QA teams to design Ingestion Pipelines, Integration APIs, and provide Hadoop ecosystem services
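As a rough illustration of the data mapping and transformation work these pipelines involve, here is a minimal, pure-Python sketch of the extract-transform-load pattern (the field names are hypothetical; a production pipeline would express the same steps as Apache Beam PTransforms or Spark operations):

```python
# Simplified ETL sketch. Field names ("src_ip", "bytes_sent") are
# illustrative only, not from any real schema.

def extract(records):
    """Extract: keep only well-formed records with both expected fields."""
    for rec in records:
        if "src_ip" in rec and "bytes_sent" in rec:
            yield rec

def transform(rec):
    """Transform: map a raw record to a normalized shape."""
    return {"source": rec["src_ip"], "kb_sent": rec["bytes_sent"] / 1024}

def load(records, sink):
    """Load: append normalized records to a sink (here, a plain list)."""
    sink.extend(records)

raw = [
    {"src_ip": "10.0.0.1", "bytes_sent": 2048},
    {"src_ip": "10.0.0.2"},                      # malformed: dropped in extract
    {"src_ip": "10.0.0.3", "bytes_sent": 512},
]

sink = []
load((transform(r) for r in extract(raw)), sink)
print(sink)  # two normalized records; the malformed one is filtered out
```

In Beam or Spark the same filter/map/write stages would run in parallel across a cluster, but the logical shape of the pipeline is the same.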
- Bachelor of Science degree in Computer Science or equivalent
- 5+ years of experience with the Hadoop ecosystem and Big Data technologies
- 2+ years of experience in cloud computing
- Competent in writing Scala, Python, or Java code
- Development experience in Dataflow or Dataproc is a plus
- Ability to adapt conventional big data frameworks and tools to the use cases required by the project
- Experience building stream-processing systems using solutions such as Spark Streaming, Storm, or Flink
- Experience with other open-source technologies like Druid, Elasticsearch, Logstash, etc. is a plus
- Knowledge of design strategies for developing scalable, resilient, always-on data lakes
- Some knowledge of agile (Scrum) development methodology is a plus
- Strong development/automation skills
- Excellent interpersonal and teamwork skills
- Can-do attitude toward problem solving and quality, and the ability to execute
Working at a high-tech cybersecurity company within Information Technology is a once-in-a-lifetime opportunity. You’ll join the brightest minds in technology, creating, building, and supporting tools that enable our global teams on the front line of defense against cyberattacks. We’re joined by one mission – but driven by the impact of that mission and what it means to protect our way of life in the digital age. Join a dynamic and fast-paced team that feels excitement at the prospect of a challenge and a thrill at resolving technical gaps that inhibit productivity.
We’re trailblazers who dream big, take risks, and challenge cybersecurity’s status quo. It’s simple: we can’t accomplish our mission without diverse teams innovating, together. To learn more about our dedication to inclusion and innovation, visit our Life at Palo Alto Networks page and our diversity website.
Palo Alto Networks is an equal opportunity employer. We celebrate diversity in our workplace, and all qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or other legally protected characteristics.
Additionally, we are committed to providing reasonable accommodations for all qualified individuals with a disability. If you require assistance or an accommodation due to a disability or special need, please contact us at email@example.com.