Responsibilities & Qualifications
KPMG is currently seeking an Associate for our Data & Analytics - Big Data Software Engineer practice.
While this requisition may state a specific geographic office, please note that our positions are location-flexible across our major hubs. Opportunities may include, but are not limited to, Atlanta, Chicago, Dallas, Denver, New York City, Orange County, Philadelphia, Seattle, and Washington DC. Please proceed with applying here, and let us know your location preference during the interview phase if applicable.
• Under the supervision and mentorship of senior team members, rapidly prototype, develop, and optimize Data & Analytics (D&A) implementations to address the Big Data and Data Science needs of a variety of Fortune 1000 corporations and other major organizations.
• Using waterfall or agile methodologies, develop and maintain D&A solutions on on-premises, cloud, KPMG-hosted, and hybrid infrastructure. Follow software engineering guidelines and industry best practices for code quality; conduct regular design and code reviews and build technical documentation.
• Serve as the data owner in cross-disciplinary teams. Discover, profile, acquire, process, model, and own data for the solutions.
• Implement data processing pipelines, data mining and data science algorithms, and visualization engineering to help clients distill insights from rich data sources (social media, news, internal or external documents, emails, financial data, client data, and operational data).
• Help research and experiment with leading and emerging Big Data technologies (serverless data lakes, microservices, Hadoop, Spark, Kafka, AWS, Microsoft Azure, GCP) and apply them to real-world client problems.
• From a data engineering point of view, help pursue innovations, targeted solutions, and extensible platforms for Lighthouse, KPMG, and clients.
• Bachelor's, Master's, or PhD from an accredited college or university in Computer Science, Computer Engineering, or a related field.
• 1+ years of relevant software development experience, preferably in professional services or a related industry.
• Hands-on experience and knowledge in software engineering: waterfall vs. agile; object-oriented vs. procedural vs. functional programming; source code version control; continuous integration; continuous delivery/deployment; and design patterns.
• Proficiency with Linux/Unix/Windows/.NET. Market-leading fluency in several programming languages preferred: Bash/ksh/PowerShell; Python/Perl/R; Java/C/C++/Scala.
• Expertise in distributed computing architecture and massively parallel processing big data platforms (Hadoop, MapReduce, HDFS, Spark, Hive/Impala, HBase/MongoDB/Cassandra, Teradata/Netezza/Redshift, etc.).
• Experience with mainstream cloud infrastructures (AWS, Microsoft Azure, and GCP), their D&A-related microservices, and how to implement data lakes and serverless data lakes.
• Ability to travel up to 80% of the time.
• Targeted graduation date of Fall 2018 through Summer 2019.
Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future.