Get to know Spry Info Solutions and you'll find a truly different technology company. We are a team of individuals with unique personalities and lifestyles. We believe in bringing together the best and brightest talent we can find in the industry - people who are passionate about their skills and bring diverse backgrounds to a dynamic team. Our long history and our relationships with our clients set us apart.

If you're looking for a company that values individual talent within team collaboration to deliver real business and technology solutions, then we're looking for you.

We offer consulting and permanent opportunities, a challenging -- yet friendly -- work environment, excellent compensation, and a comprehensive benefits package. You are encouraged to manage and develop your career according to your aspirations.

Job Posts



We are looking for a site reliability engineer with expertise in Splunk configuration, setup, and monitoring.


  • Design, develop, document, analyze, test, and modify log analytics to maintain reports, dashboards, and interfaces to and from external systems.
  • Implement integrations with external systems. Develop Splunk use cases and proliferate Splunk usage across the enterprise; provide engineering expertise and assistance to Splunk users.
  • Migrate data from the old plant to the newly spun-up plant without data loss or business impact.
  • Work with project managers and business owners to gather requirements for extracting data from log files from various sources; provide progress reports and dashboards, and perform code reviews and peer feedback.
  • Responsible for creating data models, pivots, and alerts to assess the dependency and usage of raw data and turn it into a usable source of knowledge.
  • Troubleshoot the Splunk software to identify bugs and fix them or provide workarounds.
  • Responsible for functional and acceptance testing; create test cases and test runs for different users.
  • Actively participate in application development. Support the project team for successful delivery of the client's business requirements through all phases of the implementation.
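The duties above center on turning raw log files into reports and dashboards. A minimal Python sketch of that kind of log aggregation follows; the log format, field names, and sample lines are hypothetical stand-ins for what a Splunk search or report panel would actually consume.

```python
import re
from collections import Counter

# Hypothetical sketch of the log analytics described above: parse raw
# log lines and aggregate event counts by severity, the sort of summary
# a Splunk report or dashboard panel would surface.

LOG_PATTERN = re.compile(r"^(?P<ts>\S+) (?P<level>INFO|WARN|ERROR) (?P<msg>.*)$")

def summarize_by_level(lines):
    """Count log events per severity level; skip lines that don't parse."""
    counts = Counter()
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            counts[match.group("level")] += 1
    return dict(counts)

sample = [
    "2024-01-01T00:00:00Z INFO service started",
    "2024-01-01T00:00:05Z ERROR connection refused",
    "2024-01-01T00:00:09Z ERROR connection refused",
    "not a log line",
]
```

In Splunk itself this kind of rollup would be expressed as a search (e.g. a `stats count by` over an indexed field) rather than hand-written parsing.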


Requirements:


The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Applications, Engineering or related technical field.

  • Knowledge of monitoring tools such as Splunk and Nagios; creating and optimizing dashboards and searches
  • Ability to use a relational database and SQL queries effectively
  • Knowledge of Linux systems, networking, and security.



Position: Software Engineer

Location: Phoenix, AZ

Duration: 21+ Months



  • Develop microservices-based RESTful APIs using the Spring MVC framework by writing controller, service, and DAO classes and business logic using the Java API and data structures.

  • Design multiple MapReduce programs for data extraction, transformation, and aggregation in XML, JSON, CSV, and other compressed file formats.

  • Develop Spark code using Scala and Spark SQL for data cleaning, pre-processing, and aggregation. Apply Spark RDD transformations on top of Hive external tables. Develop complex Hive queries for different file sources and write Hive UDFs. Conduct performance tuning of Hive queries.

  • Coordinate the cluster and schedule workflows using ZooKeeper and Oozie. Create HBase-Hive integration tables and load large sets of semi-structured data from various sources. Create shell scripts to simplify execution (HBase, Hive, Oozie, Kafka, and MapReduce) and move data in and out of HDFS.

  • Configure Sqoop and develop scripts to extract data from Teradata into HDFS.

  • Develop multiple Kafka producers and consumers by implementing the data ingestion process and handle clusters in real time. Test, build, design, deploy, and maintain the CI/CD process using tools like Jenkins, Maven, Git, Docker, and Enterprise Cloud Platform (eCP).

  • Provide daily progress in stand-ups, participate in code reviews, work effectively with QA in creating test cases, and deploy new releases to production.
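The extract-transform-aggregate work described above follows the classic map/reduce pattern. As a self-contained illustration only, here is that pattern in pure Python over hypothetical JSON records; the production version would run as Spark or MapReduce jobs over HDFS, not in a single process.

```python
import json
from collections import defaultdict

# Pure-Python sketch of the extract-transform-aggregate pattern in the
# duties above. Record fields ("source", "bytes") are made up for
# illustration.

records = [
    '{"source": "web", "bytes": 120}',
    '{"source": "app", "bytes": 300}',
    '{"source": "web", "bytes": 80}',
]

def map_phase(raw):
    """Extract (key, value) pairs from raw JSON lines."""
    for line in raw:
        rec = json.loads(line)
        yield rec["source"], rec["bytes"]

def reduce_phase(pairs):
    """Sum values per key, like a reduceByKey over a Spark RDD."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

totals = reduce_phase(map_phase(records))
```

In Spark the same logic would be roughly `rdd.map(parse).reduceByKey(add)`, with the framework handling partitioning and shuffling across the cluster.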


  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Applications, Engineering or related technical field.

  • Knowledge in Hadoop, Hive, Data Structures, Core Java, Scripting, Spark, Python, Scala




Client is looking for a SQL/Hive developer who will assist with development across a Hive-based Hadoop stack in Alpharetta, GA for an ongoing project. The ideal candidate will work with a team of Big Data developers, data analysts, and business users to develop, implement, and fully test the appropriate solutions on a Big Data platform.


  • Collaborate closely with Product Owners and cross-functional development and business teams to design high-quality integration solutions that are extensible, reusable, and secure. Actively participate in Scrum meetings and work with technical leads to understand user requirements and develop/test technical solutions.
  • Develop REST web service APIs supporting both XML and JSON using Spring MVC; develop unit test cases using the JUnit framework and use Log4j for logging. Work with RESTful APIs, JSON, and Identity and Access Management solutions using JSON Web Tokens or OAuth and Azure API Management; contribute to API management, security, analytics, and continuous integration.
  • Apply Hibernate annotations to retrieve data from the database and integrate with Spring and JPA to interact with the back-end SQL Server and display the results in JSON format. Apply Hibernate queries to connect to the database, retrieve information, and design REST services to display the output.
  • Create SQL queries against the data stored in MS SQL Server 2017 Management Studio and create complex queries, views, functions, and stored procedures. Apply Core Java concepts such as Collections, multi-threading, and serialization, and perform user-interface validations on the client side.
  • Implement the application using Agile development methodology. Ensure quality through unit and functional test implementation and execution. Work with Azure API Management with a deep understanding of its configuration, usage, and use cases.
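The duties above include building complex queries and views over a relational store. This is an illustrative sketch only: the posting targets MS SQL Server 2017, but the same query/view pattern is shown here with Python's stdlib `sqlite3` module so it runs anywhere, and the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical orders table and an aggregating view, akin to the
# complex queries/views/stored procedures described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'acme', 100.0), (2, 'acme', 50.0),
                              (3, 'globex', 75.0);
    -- A view rolling up spend per customer.
    CREATE VIEW customer_totals AS
        SELECT customer, SUM(amount) AS total
        FROM orders GROUP BY customer;
""")

rows = conn.execute(
    "SELECT customer, total FROM customer_totals ORDER BY total DESC"
).fetchall()
```

On SQL Server the view definition would be the same shape; stored procedures and T-SQL functions would layer additional logic on top of it.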


  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology or related technical field.
  • Understanding of Hadoop eco-system technologies.
  • Big Data Application development 



Client : Sirius XM

Location : Irving, TX

Duration : 12+ Months


Looking for an SDET Engineer in Irving, Texas for a multi-year project (initial start with 1 year).


  • Decompose and gather SiriusXM Streaming Services application business requirements, write effective acceptance criteria, and develop automated test scripts for complex cases.
  • Analyze both Android and iOS platforms to ensure the quality and range of products spanning mobile apps and web services for new features.
  • Coordinate with designers to ensure the designed interfaces are implemented correctly on the Android and iOS platforms, following the corresponding UI rules and patterns.
  • Work in a multithreaded system interacting with various data sources and consumers through REST APIs and other means.
  • Possess the technical competency to interact with application developers to ensure software quality, including design sessions and code reviews.
  • Develop automated scripts that support the existing Continuous Integration/Continuous Delivery pipeline.
  • Create and maintain an environment where Test-Driven Development is fostered. Debug the SiriusXM application's SSL traffic using Charles Proxy.
  • Work with Jira, Jenkins, and CI/CD automation pipelines that build and deploy to development, testing, and production environments.
  • Act as a quality advocate within the Scrum team to ensure standards are upheld throughout the Agile SDLC.
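As a minimal sketch of the automated acceptance testing described above: real suites would exercise the mobile apps and REST APIs (for example via Appium or an HTTP client), but here a hypothetical response validator stands in so the example is self-contained and runnable with Python's stdlib `unittest`.

```python
import unittest

# Hypothetical acceptance check: a playable stream response needs a
# 200 status and a non-empty stream URL. Field names are illustrative.
def validate_stream_response(resp):
    return resp.get("status") == 200 and bool(resp.get("stream_url"))

class StreamResponseTests(unittest.TestCase):
    def test_valid_response_passes(self):
        self.assertTrue(validate_stream_response(
            {"status": 200, "stream_url": "https://example.com/s1"}))

    def test_missing_url_fails(self):
        self.assertFalse(validate_stream_response({"status": 200}))

# Run the suite programmatically, as a CI/CD pipeline stage might.
suite = unittest.TestLoader().loadTestsFromTestCase(StreamResponseTests)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

Tests written in this acceptance-criteria style slot directly into a Jenkins stage, failing the build when `outcome` reports errors.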



  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology, or an equivalent or related technical field.
  • Hands-on experience with testing, software quality assurance, and test automation of mobile applications on iOS and Android platforms.
  • Hands-on experience scripting and developing test automation suites.
  • Set up, configure, and maintain automated testing environments.





Client : State of GA (DOC)

 Location : Atlanta, GA

Duration : Long Term



  •  Scribe Application Enhancements and Efficiency Improvement – Make appropriate changes to achieve desired results by modifying Struts/Spring MVC/JavaScript/jQuery/SQL.
  •  Code Correctness and Quality Tests – Perform trial runs to ensure they produce the desired information and that the instructions are correct.
  •  Develop complex programs that handle specific user interface (UI) web pages using web technologies such as HTML, CSS, JavaScript, and jQuery, integrating with the backend programs using the Spring MVC and Struts frameworks for the Scribe Application.
  •  Reusable Programming Techniques – Repair or modify existing programs to increase performance efficiency.
  •  Interaction with Cross-Functional Teams – Interact with engineering and technical personnel to clarify program intent, identify problems, and suggest changes.
  •  System Synthesis – Perform systems analysis and programming tasks to maintain and control the use of computer systems software as a Software Engineer.
  •  Best Coding Practices – Maintain documentation of code development and subsequent revisions. Write comments in the modified code to help others understand the change being implemented.
  •  Documentation – Prepare detailed documentation to help the system admin deploy the program, and prepare a workflow diagram describing the input and flow of the program being deployed.

Required Skills:

  •  The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology or related technical field.
  •  Hands-on experience working with Java, Spring REST APIs, Jenkins or Maven, and REST- and SOAP-based web services.
  •  Experience developing web APIs using REST and its security; microservices on Spring Boot.




Office Address

Corporate Headquarters

9330 LBJ Freeway, Suite 900, Dallas, TX 75243, USA
Get Directions

Georgia Office:

1081 Cambridge SQ, Suite B, Alpharetta, GA 30009, USA
Get Directions






Business Hours

Monday - Friday

9:00AM - 5:00PM