Job Type : W2

Experience : 0-2 yrs

Location : Alpharetta, GA

Posted Date : 03-Apr-2018

Description :


Client is looking for a SQL/Hive developer to assist with development across a Hive-based Hadoop stack in Alpharetta, GA for an ongoing project. The ideal candidate will work with a team of Big Data developers, data analysts, and business users to develop, implement, and fully test the appropriate solutions on a Big Data platform.
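For illustration, a minimal sketch of the kind of Hive development involved, assuming a HiveServer2 endpoint reachable over JDBC; the connection URL, credentials, table, and column names below are hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryExample {
        public static void main(String[] args) throws Exception {
            // Register the Hive JDBC driver (hive-jdbc on the classpath).
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // Hypothetical HiveServer2 endpoint and credentials.
            String url = "jdbc:hive2://hive-host:10000/default";
            try (Connection conn = DriverManager.getConnection(url, "user", "");
                 Statement stmt = conn.createStatement()) {
                // Aggregate rows from a hypothetical table in the Hive warehouse.
                ResultSet rs = stmt.executeQuery(
                    "SELECT event_date, COUNT(*) AS events "
                  + "FROM web_events GROUP BY event_date");
                while (rs.next()) {
                    System.out.println(rs.getString("event_date") + "\t" + rs.getLong("events"));
                }
            }
        }
    }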

Responsibilities:

  • Collaborate closely with Product Owners and cross-functional development and business teams to design high-quality integration solutions that are extensible, reusable, and secure. Actively participate in Scrum meetings and work with Technical Leads to understand user requirements and to develop and test technical solutions.
  • Develop REST web service APIs supporting both XML and JSON using Spring MVC; develop unit test cases with the JUnit framework and use Log4j for logging (a minimal controller sketch follows this list). Work with RESTful APIs, JSON, and Identity and Access Management solutions using JSON Web Tokens or OAuth, along with Azure API Management; contribute to API management, security, analytics, and continuous integration.
  • Apply Hibernate annotations to retrieve data from the database, integrate with Spring and JPA to interact with a back-end SQL Server, and display the results in JSON format. Use Hibernate queries to connect to the database, retrieve the information, and design REST services to display the output (see the entity mapping sketch after this list).
  • Create SQL queries against data stored in MS SQL Server 2017 (managed through SQL Server Management Studio), and build complex queries, views, functions, and stored procedures (see the stored-procedure sketch after this list). Apply Core Java concepts such as Collections, multithreading, and serialization, and perform client-side user interface validations.
  • Implement the application using Agile development methodology. Ensure quality through unit and functional test implementation and execution (a JUnit sketch follows this list). Engage with Azure API Management with a deep understanding of its configuration, usage, and use cases.
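
A minimal sketch of a Spring MVC REST endpoint with Log4j logging, as described in the second bullet; the CustomerController name, route, and Customer payload are hypothetical, and Jackson defaults are assumed for JSON serialization:

    import org.apache.log4j.Logger;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class CustomerController {
        private static final Logger log = Logger.getLogger(CustomerController.class);

        // Hypothetical endpoint; the return value is serialized to JSON by default.
        @GetMapping("/customers/{id}")
        public Customer getCustomer(@PathVariable long id) {
            log.info("Fetching customer " + id);
            return new Customer(id, "Sample Customer");
        }
    }

    // Simple payload class; getters are required for JSON serialization.
    class Customer {
        private final long id;
        private final String name;

        Customer(long id, String name) { this.id = id; this.name = name; }
        public long getId() { return id; }
        public String getName() { return name; }
    }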
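A sketch of the Hibernate/JPA mapping pattern from the third bullet, assuming Spring Data JPA is in use; the Order entity, table, and column names are hypothetical:

    import java.util.List;
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Table;
    import org.springframework.data.jpa.repository.JpaRepository;

    // Hypothetical entity mapped onto a back-end SQL Server table.
    @Entity
    @Table(name = "orders")
    public class Order {
        @Id
        @Column(name = "order_id")
        private Long id;

        @Column(name = "status")
        private String status;

        public Long getId() { return id; }
        public String getStatus() { return status; }
    }

    // Spring Data derives the SQL from the method name; a REST controller
    // returning these entities would render them as JSON.
    interface OrderRepository extends JpaRepository<Order, Long> {
        List<Order> findByStatus(String status);
    }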
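A sketch of calling a SQL Server stored procedure from Java, per the fourth bullet; the connection string, credentials, procedure name, and column are hypothetical (assumes the Microsoft JDBC driver on the classpath):

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class StoredProcExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical SQL Server connection string.
            String url = "jdbc:sqlserver://db-host:1433;databaseName=sales";
            try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
                 // Standard JDBC escape syntax for invoking a stored procedure.
                 CallableStatement cs = conn.prepareCall("{call dbo.GetOrdersByStatus(?)}")) {
                cs.setString(1, "SHIPPED");
                try (ResultSet rs = cs.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong("order_id"));
                    }
                }
            }
        }
    }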
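And a minimal JUnit 4 unit test exercising the controller sketched above, per the last bullet; names remain hypothetical:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class CustomerControllerTest {
        @Test
        public void getCustomerReturnsRequestedId() {
            CustomerController controller = new CustomerController();
            Customer result = controller.getCustomer(42L);
            // The endpoint should echo back the requested id.
            assertEquals(42L, result.getId());
        }
    }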


Requirements:

  • The minimum education requirement to perform the above job duties is a Bachelor's degree in Computer Science, Information Technology, or a related technical field.
  • Understanding of Hadoop ecosystem technologies.
  • Experience with Big Data application development.