Careers

Get to know Spry Info Solutions and you'll find a truly different technology company. We are a team of individuals with unique personalities and lifestyles. We believe in bringing together the best and brightest talent we can find in the industry - people who are passionate about their skills and bring diverse backgrounds to a dynamic team. Our long history and our relationships with our clients set us apart.

If you're looking for a company that aligns itself with the value of individual talent within team collaboration in delivering real business and technology solutions, then we're looking for you.

We offer consulting and permanent opportunities, a challenging -- yet friendly -- work environment, excellent compensation, and a comprehensive benefits package. You are encouraged to manage and develop your career according to your aspirations.

Enquiry & Sales: info@spryinfosol.com

Career: hr@spryinfosol.com

Job Posts




Position: Software Engineer (Hadoop)

Location: Philadelphia 

 

Description:

We are looking for a Software Engineer with knowledge of Hadoop and Spark, and experience with data mining and stream processing technologies (Kafka, Spark Streaming, Akka Streams).

Responsibilities:

  • Translate functional and technical requirements into detailed specifications running on AWS, using services such as EC2, ECS, RDS Aurora MySQL, SQS, SNS, KMS, and Athena.

  • Migrate current Prometheus and DARQ applications to a managed multi-account AWS cloud environment using containers and scalable computing platforms such as Docker (ECS) and Kubernetes.

  • Develop Spark code using Scala and Spark SQL/Streaming for faster testing and processing of data. Create, optimize, and troubleshoot complex SQL queries to retrieve and analyze data from databases such as Redshift, Oracle, MS SQL Server, MySQL, and PostgreSQL.

  • Design ETL transformations and jobs using Pentaho Kettle Spoon Designer 5.7.12 and Pentaho Data Integration Designer, and schedule them on the ETL WFE application Carte Server.

  • Design, code, test, and customize RHQ reports for market systems data, and provide data quality solutions to meet client requirements.

  • Develop various complex queries for different data sources, such as the Nasdaq Data Warehouse and the Revenue Management System, and work on performance tuning of queries. Create scripts to automate the data ingestion process.

  • Build and deploy artifacts (RPMs) and services using GitLab pipelines to Dev, QC, and Prod AWS accounts in the cloud platform.

  • Validate DARQ data and reports, and manage document libraries on the collaboration site using Confluence.

 

Requirements:

  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology, or a related technical field.

  • Knowledge of Hadoop, Spark, and Kafka.

  • Strong communication skills.





Description:

The QlikView Developer will play a key technical role in the support and maintenance of the QlikView / Qlik Sense platform, and will be responsible for supporting our user community in the architecture and design of reports developed both inside and outside of the BI team.

Duration: Long Term

 

Responsibilities:

  • From a QlikView and Qlik Sense point of view, work on all support tickets (incidents, minor enhancements, service requests, and application and platform support).

  • Provide QlikView and Qlik Sense platform support: maintain multiple farms consisting of load-balanced dashboards, Mashups, and external back-end application servers (Dev/QA/UAT/PROD); monitor via the operations and session monitoring tools; and handle server performance tuning, capacity planning, and problem resolution.

  • Design and develop QlikView and Qlik Sense applications as part of application support.

  • Conduct business requirement gathering and discovery workshops to gather business and functional requirements.

  • Perform quality checks on end products against all quality metrics and track QlikView & Qlik Sense performance issues; provide strategic support for QlikView & Qlik Sense integrations, deployments, and configurations.

  • Work with the vendor on new releases and conduct POCs for Qlik Sense application installation and upgrades; manage and allocate Dev, QA, UAT, and PROD environments.

  • Manage deployments of Qlik applications: create streams for newly onboarded applications, publish applications, and deploy the CSS files. Build QVDs from big data sources, and maintain QlikView web configs, InfoPath forms, QlikView and Qlik Sense services, and infrastructure updates.

  • Create Qlik Sense extensions using HTML, CSS style sheets, and JavaScript per branding specifications. Handle Windows server installation, CAL configuration, IIS administration, and maintenance of group and local policies.

 

Requirements:

  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology, or a related technical field.

  • Knowledge of QlikView / Qlik Sense development and administration.

  • Strong communication skills.

  • Data-driven and customer-focused.





 

We are seeking a Software Engineer (BI, Splunk) to join our growing team in the Information Services Department. We are currently hiring a Splunk Engineer with knowledge of Business Intelligence and Analytics and their tools for our direct customer in IT, based out of Chester, NY.

 

Duration: 12+ month project (strong possibility of extension).

 

Responsibilities:

  • Install and configure syslog-ng on multiple Linux, AIX, and Solaris servers to maintain syslog data on a single centralized server.
  • Install forwarders/agents on client servers to pull syslog data, and actively participate with the development team to integrate different applications with Splunk.
  • Create and maintain all configuration files (inputs, outputs, deployment client, server) on the forwarder server to push data to multiple Splunk indexers.
  • Configure the data retention policy and maintain hot, warm, and cold bucket directories within the Splunk environment.
  • Set up and configure a search head cluster, along with a cluster master, with multiple search head nodes, and manage the search head cluster with the deployer.
  • Set indexing property configurations on data forwarded from different application servers, including time zone offsets and custom source type rules. Configure regex transformations to perform data extractions.
  • Develop Splunk dashboards, searches, and reporting to support various internal clients in Security, IT Operations, and Application Development.
  • Create dashboards, reports, scheduled searches, and alerts using XML. Apply knowledge of the Splunk architecture and its components (indexer, forwarder, search head, deployment server), heavy and universal forwarders, and the license model.
  • Work on client engagements and data onboarding; write alerts, dashboards, reports, and lookups using the Search Processing Language (SPL); and create Splunk scheduled alerts to trigger email notifications in case of abnormalities.
  • Create and manage Splunk DB Connect identities, database connections, database inputs and outputs, lookups, and access controls.

 

Requirements:

  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology, Applications, Engineering, or a related technical field.
  • Strong knowledge of Splunk Enterprise.

 






We are looking for a Software Engineer with experience in Salesforce development, integration, Service Cloud, and Salesforce Lightning, in O'Fallon, MO.

Duration: 12+ month project (strong possibility of extension).

 

Responsibilities:

  • Design and develop Salesforce cloud solutions with a single-page Lightning custom application using web technologies such as HTML5, CSS3, and JavaScript; retrieve data using Apex; and build integrations using REST and SOAP services.

  • Interact with Business Analysts, Business Partners, and Architects to understand and analyze the requirements and transform their markup into a fully fledged cloud-based web application.

  • Create conceptual documentation and manage detailed user interface specifications and component sheets.

  • Interact constantly with the QA teams to identify possible bugs and defects beforehand and fix them before each release to maintain a stable application.

  • Perform system analysis and programming activities, which may require extensive research and analysis of production issues. Participate in code deployments and releases.

  • Provide ongoing support, maintenance and enhancement of the SaaS based application. Ensure timely, effective, and quality delivery of software into production, both at an individual and team level.  

 

Requirements:

  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science, Information Technology, Applications, or a related technical field.

  • Knowledge of Salesforce development, integration, Service Cloud, and Salesforce Lightning.

 





We are looking for a Splunk Engineer with expertise in Splunk configuration, setup, and monitoring for a long-term project in Reston, VA.

Responsibilities:

 

  • Analyze, design, develop, test, implement, and maintain detailed Splunk solutions using various tools and technologies under multiple operating systems. Develop technical and functional requirements and design specifications for custom Splunk application software solutions.

  • Build and maintain monitoring for the large volume of emails transmitted from/to executive accounts.

  • Build detection and workflows for mitigating phishing and suspicious transactions from non-AIG domains. Identify malicious local device malware activity and lock down AIG access.

  • Manage the Web Application Firewall (WAF) and use scripting languages that support the Splunk infrastructure; install, test, and deploy monitoring solutions with Splunk services.

  • Handle Splunk application logs and shared service (Authentication Service) logs through the data lifecycle. Support, develop, and implement automation and efficiencies with Splunk, and introduce new content for alerts and data sources to analysts.

  • Prepare workflow charts and diagrams to specify detailed operations to be performed by the Application Security Monitoring framework and its execution strategy.

Requirements:

 

  • The minimum education requirements to perform the above job duties are a Bachelor’s degree in Computer Science or a related technical field.

  • Strong knowledge of Splunk, including creating and optimizing dashboards and searches.

  • Technical writing and creation of formal documentation such as diagrams, technical designs, best practices, workflows, and processes.




Office Address

Corporate Headquarters

9330 LBJ Freeway, Suite 900, Dallas, TX 75243, USA

Georgia Office:

1081 Cambridge SQ, Suite B, Alpharetta, GA 30009, USA

Phone

+1(214)561-6706

Fax

+1(214)561-6795

Email

info@spryinfosol.com

Business Hours

Monday - Friday

9:00AM - 5:00PM