Careers

Get to know Spry Info Solutions and you'll find a truly different technology company. We are a team of individuals with unique personalities and lifestyles. We believe in bringing together the best and brightest talent we can find in the industry - people who are passionate about their skills and who bring diverse backgrounds to a dynamic team, backed by our long history and our relationships with our clients.

If you're looking for a company that aligns itself with the value of individual talent within team collaboration in delivering real business and technology solutions, then we're looking for you.

We offer consulting and permanent opportunities, a challenging -- yet friendly -- work environment, excellent compensation, and a comprehensive benefits package. You are encouraged to manage and develop your career according to your aspirations.

Enquiry & Sales: info@spryinfosol.com

Careers: hr@spryinfosol.com

Job Posts





We are looking for a Software Engineer in O'Fallon, MO with experience in Salesforce development, integration, Service Cloud, and Salesforce Lightning.

Duration: 12+ month project (strong possibility of extensions).

 

Responsibilities:

  • Design and develop Salesforce cloud solutions as a single-page Lightning custom application using web technologies such as HTML5, CSS3, and JavaScript; retrieve data using Apex and build integrations using REST and SOAP services (a sketch follows this list).

  • Interact with business analysts, business partners, and architects to understand and analyze requirements and transform their mockups into a fully fledged cloud-based web application.

  • Create conceptual documentation and manage detailed user interface specifications and component sheets.

  • Interact constantly with QA teams to identify possible bugs and defects early and fix them before each release to maintain a stable application.

  • Perform system analysis and programming activities, which may require extensive research and analysis of production issues. Participate in code deployments and releases.

  • Provide ongoing support, maintenance, and enhancement of the SaaS-based application. Ensure timely, effective, and quality delivery of software into production, at both the individual and team level.
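
For illustration only, here is a minimal sketch of the kind of REST-based data retrieval this role involves, using the third-party simple-salesforce Python library. The credentials, object, and fields are placeholders, not details of the actual project:

    # Minimal sketch: query Salesforce records over the REST API.
    # All credentials and the SOQL query below are placeholders.
    from simple_salesforce import Salesforce

    sf = Salesforce(
        username="user@example.com",      # placeholder credentials
        password="password",
        security_token="token",
    )

    # Run a SOQL query against the org; the result holds a 'records' list.
    result = sf.query("SELECT Id, Subject FROM Case WHERE Status = 'New'")
    for record in result["records"]:
        print(record["Id"], record["Subject"])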

 

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor's degree in Computer Science, Information Technology, Applications, or a related technical field.

  • Knowledge of Salesforce development, integration, Service Cloud, and Salesforce Lightning.

 





We are looking for a Splunk engineer with expertise in Splunk configuration, setup, and monitoring for a long-term project in Reston, VA.

Responsibilities:

 

  • Analyze, design, develop, test, implement, and maintain detailed Splunk solutions using various tools and technologies under multiple operating systems. Develop technical and functional requirements and design specifications for custom Splunk application software solutions.

  • Build and maintain monitoring for the large volume of email transmitted to and from executive accounts.

  • Build detection and mitigation workflows for phishing and suspicious transactions from non-AIG domains. Identify malicious local device malware activity and lock down AIG access.

  • Manage the Web Application Firewall (WAF) and use scripting languages that support the Splunk infrastructure; install, test, and deploy monitoring solutions with Splunk services.

  • Handle Splunk application logs and shared-service (authentication service) logs across the data lifecycle; develop and implement automation and efficiencies with Splunk, and introduce new alert content and data sources to analysts (a search sketch follows this list).

  • Prepare workflow charts and diagrams specifying the detailed operations to be performed by the application security monitoring framework and its execution strategy.
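
For illustration, a minimal sketch of running a Splunk search through the management REST API with Python's requests library; the host, credentials, index, and SPL query are placeholders:

    # Minimal sketch: stream results of a Splunk search via the REST API.
    # Host, credentials, and the SPL query are placeholders.
    import requests

    SPLUNK = "https://splunk.example.com:8089"    # hypothetical management port

    resp = requests.post(
        f"{SPLUNK}/services/search/jobs/export",  # synchronous export endpoint
        auth=("admin", "changeme"),               # placeholder credentials
        data={
            "search": "search index=mail sourcetype=smtp | stats count by sender",
            "output_mode": "json",
        },
        verify=False,  # lab-only shortcut; verify TLS certificates in practice
        stream=True,
    )
    for line in resp.iter_lines():
        if line:
            print(line.decode())          # one JSON result object per line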

Requirements:

 

  • The minimum education requirement for the above job duties is a Bachelor's degree in Computer Science or a related technical field.

  • Strong knowledge of Splunk, including creating and optimizing dashboards and searches.

  • Technical writing and creation of formal documentation such as diagrams, technical designs, best practices, workflows, and processes.





 

We are looking for a Site Reliability Engineer with expertise in Splunk configuration, setup, and monitoring.

Responsibilities:

  • Design, develop, document, analyze, create, test, and modify log analytics to maintain reports, dashboards, and interfaces to and from external systems.
  • Implement integrations to external systems. Develop Splunk use cases and proliferate Splunk usage across the enterprise; provide engineering expertise and assistance to Splunk users.
  • Migrate data from the old plant to the newly spun-up plant without data loss or business impact.
  • Work with project managers and business owners to gather requirements for extracting data from log files from various sources; provide progress reports and dashboards, and perform code reviews and peer feedback.
  • Create data models, pivots, and alerts to assess the dependency and usage of raw data, turning it into a usable source of knowledge (a saved-search sketch follows this list).
  • Troubleshoot the Splunk software to identify bugs and either fix them or provide workarounds.
  • Take responsibility for functional and acceptance testing, creating test cases and test runs for different users.
  • Actively participate in application development, supporting the project team in the successful delivery of the client's business requirements through all phases of the implementation.
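
As a rough illustration of the alerting work described above, the sketch below creates a scheduled saved search (the basis of a Splunk alert) through the REST API; the search name, SPL string, and credentials are invented for the example:

    # Minimal sketch: register a scheduled saved search via the Splunk REST API.
    # The search name, SPL string, and credentials are placeholders.
    import requests

    SPLUNK = "https://splunk.example.com:8089"

    resp = requests.post(
        f"{SPLUNK}/services/saved/searches",
        auth=("admin", "changeme"),            # placeholder credentials
        data={
            "name": "errors_last_15m",         # hypothetical alert name
            "search": "index=app_logs log_level=ERROR | stats count",
            "is_scheduled": "1",
            "cron_schedule": "*/15 * * * *",   # run every 15 minutes
        },
        verify=False,  # lab-only shortcut; verify TLS certificates in practice
    )
    print(resp.status_code)                    # 201 on successful creation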

 

Requirements:

 

  • The minimum education requirement for the above job duties is a Bachelor's degree in Computer Science, Applications, Engineering, or a related technical field.

  • Knowledge of monitoring tools such as Splunk and Nagios, including creating and optimizing dashboards and searches.
  • Ability to use a relational database and SQL queries effectively.
  • Knowledge of Linux systems, networking, and security.





Position: Software Engineer

Location: Phoenix, AZ

Duration: 21+ Months

 

Responsibilities:

  • Develop microservices-based RESTful APIs using the Spring MVC framework by writing controller, service, and DAO classes and business logic using the Java API and data structures.

  • Design multiple MapReduce programs for data extraction, transformation, and aggregation in XML, JSON, CSV, and other compressed file formats.

  • Develop Spark code using Scala and Spark SQL for data cleaning, pre-processing, and aggregation (a sketch follows this list). Apply Spark RDD transformations on top of Hive external tables. Develop complex Hive queries for different file sources, write Hive UDFs, and tune Hive query performance.

  • Coordinate the cluster and schedule workflows using ZooKeeper and Oozie. Create HBase-Hive integration tables and load large sets of semi-structured data from various sources. Create shell scripts to simplify execution (HBase, Hive, Oozie, Kafka, and MapReduce) and move data into and out of HDFS.

  • Configure Sqoop and develop scripts to extract data from Teradata into HDFS.

  • Develop multiple Kafka producers and consumers, implementing the data ingestion process and handling clusters in real time. Test, build, design, deploy, and maintain the CI/CD process using tools such as Jenkins, Maven, Git, Docker, and Enterprise Cloud Platform (eCP).

  • Report daily progress in stand-ups, participate in code reviews, work effectively with QA in creating test cases, and deploy new releases to production.
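
For illustration, a minimal PySpark sketch of the cleaning-and-aggregation pattern mentioned in the Spark bullet above; the input path, column names, and output location are placeholders:

    # Minimal sketch: Spark SQL-style cleaning and aggregation.
    # Paths and column names are placeholders, not project specifics.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Read semi-structured JSON input (hypothetical HDFS path).
    df = spark.read.json("hdfs:///data/raw/events.json")

    # Clean: drop rows missing key fields, then aggregate per event type.
    summary = (
        df.dropna(subset=["event_type", "amount"])
          .groupBy("event_type")
          .agg(F.count("*").alias("events"), F.sum("amount").alias("total"))
    )
    summary.write.mode("overwrite").parquet("hdfs:///data/curated/summary")

    spark.stop()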

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor's degree in Computer Science, Applications, Engineering, or a related technical field.

  • Knowledge of Hadoop, Hive, data structures, Core Java, scripting, Spark, Python, and Scala.





 

The client is looking for a SQL/Hive developer to assist with development across a Hive-based Hadoop stack in Alpharetta, GA for an ongoing project. The ideal candidate will work with a team of Big Data developers, data analysts, and business users to develop, implement, and fully test the appropriate solutions on a Big Data platform.

Responsibilities:

  • Collaborate closely with product owners and cross-functional development and business teams to design high-quality integration solutions that are extensible, reusable, and secure. Actively participate in Scrum meetings and work with technical leads to understand user requirements and develop and test technical solutions.
  • Develop REST web service APIs supporting both XML and JSON using Spring MVC; develop unit test cases with the JUnit framework and use Log4j for logging. Work with RESTful APIs, JSON, and identity and access management solutions using JSON Web Tokens or OAuth, along with Azure API Management; contribute to API management, security, analytics, and continuous integration.
  • Apply Hibernate annotations to retrieve data from the database, integrating with Spring and JPA to interact with the back-end SQL Server and display results in JSON format. Use Hibernate queries to connect to the database, retrieve information, and design REST services to display the output.
  • Create SQL queries to manage data stored in MS SQL Server 2017 (via Management Studio) and create complex queries, views, functions, and stored procedures. Apply Core Java concepts such as collections, multi-threading, and serialization, and perform client-side user interface validations.
  • Implement the application using an Agile development methodology. Ensure quality through unit and functional test implementation and execution. Engage in Azure API Management with a deep understanding of its configuration, usage, and use cases.

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor's degree in Computer Science, Information Technology, or a related technical field.
  • Understanding of Hadoop ecosystem technologies (a sketch follows this list).
  • Big Data application development.
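
For illustration, a minimal sketch of querying Hive from Python using the PyHive library; the host, database, and table are placeholders:

    # Minimal sketch: run a Hive query from Python with PyHive.
    # Host, port, database, and table names are placeholders.
    from pyhive import hive

    conn = hive.Connection(host="hive.example.com", port=10000,
                           username="etl_user", database="default")
    cursor = conn.cursor()

    # A representative aggregate query against a hypothetical 'orders' table.
    cursor.execute(
        "SELECT customer_id, COUNT(*) AS order_count "
        "FROM orders GROUP BY customer_id "
        "ORDER BY order_count DESC LIMIT 10"
    )
    for row in cursor.fetchall():
        print(row)

    conn.close()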




Office Address

Corporate Headquarters

9330 LBJ Freeway, Suite 900, Dallas, TX 75243, USA
Get Directions

Georgia Office

1081 Cambridge SQ, Suite B, Alpharetta, GA 30009, USA
Get Directions

Phone

+1(214)561-6706

Fax

+1(214)561-6795

Email

info@spryinfosol.com

Business Hours

Monday - Friday

9:00AM - 5:00PM