Careers

Get to know Spry Info Solutions and you'll find a truly different technology company. We are a team of individuals with unique personalities and lifestyles. We believe in bringing together the best and brightest talent we can find in the industry - people who are passionate about their skills and bring diverse backgrounds to a dynamic team. Our long history and our relationships with our clients reflect that commitment.

If you're looking for a company that values individual talent within team collaboration to deliver real business and technology solutions, then we're looking for you.

We offer consulting and permanent opportunities, a challenging yet friendly work environment, excellent compensation, and a comprehensive benefits package. You are encouraged to manage and develop your career according to your aspirations.

Enquiry & Sales: info@spryinfosol.com

Careers: hr@spryinfosol.com

Job Posts




 Job Description:

Position: Dot Net Full Stack Developer

Location: South Carolina

Duration: 12 Months + Extensions

Full Stack C# / ASP.NET Developer

Responsibilities:

 

  • Participate in business and system requirements sessions, including requirements elicitation and translation into technical specifications; develop the solution road map for future growth, process architecture, and mapping of disaster recovery functions.

  • Design applications based on the identified architecture and support implementation of the design by resolving complex technical issues faced by the IT project team during development, deployment, and support.

  • Develop and maintain web-based applications using the .NET Framework (C#, VB), WCF, and Entity Framework, with working knowledge of SQL Server databases.

  • Design front-end applications using the AngularJS framework, JavaScript, HTML, Bootstrap, and CSS. Develop REST and SOAP web services and consume third-party APIs.

  • Work across work streams to determine solution designs that impact the core frameworks and components.

  • Perform performance optimization of .NET applications and shell scripts. Prepare estimates, release plans, and a road map for future releases.

  • Define RDBMS models and schemas. Perform performance tuning of database schemas, migrations, and slowly changing dimension databases.

  • Prepare and maintain technical design documents for code development and subsequent revisions. Prepare workflow diagrams describing the deployment process for system administrators.

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor’s degree in Computer Science, Information Technology, or a related technical field.

  • Good knowledge of C#/.NET applications and ASP.NET.

  • Experience with web technologies such as AngularJS, Web API, MVC, HTML, JavaScript, and jQuery.





Job Description:

Position: Software Engineer - Hadoop

Location: Bentonville, AR

Duration: 1 year + Extensions

 

 

Responsibilities:

  • Install and configure Hadoop platform distributions (Cloudera, Hortonworks, and MapR) and Hadoop component services; add edge nodes and gateway nodes and assign the master and slave nodes in the cluster.

  • Add and delete nodes, connect to servers through secure shell (SSH), and set up rack awareness.

  • Set up the HDFS replication factor for data replicas, configure log4j properties, and integrate AWS CloudWatch. Upgrade and patch the cluster from one version to another (CDH, HDP, MapR) and patch the Linux servers.

  • Optimize and tune the Hadoop environments to meet performance requirements. Install and configure monitoring tools for all critical Hadoop systems and services.

  • Configure and maintain high availability (HA) of HDFS, the YARN (Yet Another Resource Negotiator) Resource Manager, MapReduce, Hive, HBase, Kafka, and Spark.

  • Manage scalable Hadoop virtual and physical cluster environments. Manage backup and disaster recovery for Hadoop data. Work in tandem with big data developers and designers to build scalable, supportable infrastructure for specific use cases.

  • Provide responsive support for day-to-day requests from development, support, and business analyst teams.

  • Perform performance analysis and debugging of slow-running development and production processes. Perform product/tool upgrades and apply patches for identified defects with root cause analysis (RCA). Perform ongoing capacity management forecasts, including timing and budget considerations.

  • Design scripts to automate jobs in Hadoop environments and run validation checks to monitor cluster health.

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor’s degree in Computer Science, Information Technology, or a related technical field.

  • Good knowledge of Cloudera, Hadoop, HDFS, Hive, Oozie, Spark, Python, Scala, and Splunk.

  • Performance tuning of Hadoop clusters.





Job Description:

Looking for a Hadoop Engineer for an 18+ month project in Mason, OH.

Responsibilities:

  • Build database prototypes to validate system requirements in discussion with project managers, business owners, and analyst teams. Document code and perform code reviews.

  • Design, develop, validate, and deploy Talend ETL processes for the DWH team using Hive on Hadoop.

  • Build data pipelines for ingestion and aggregation events and load consumer response data into Hive external tables in HDFS to serve as a feed for several dashboards and web APIs. Develop Sqoop scripts to migrate data from Oracle to the big data environment.

  • Use the Spark API, including Spark Context, Spark SQL, Spark UDFs, and Spark DataFrames, to better optimize existing algorithms. Work with different file formats such as CSV, JSON, Avro, text, and Parquet, and compression techniques such as Snappy, according to the client's request.

  • Integrate Spark with MongoDB and create Mongo collections consumed by API teams. Convert Hive/SQL queries into Spark transformations using Spark RDDs and Scala.

  • Work on a Kafka POC to publish messages into Kafka topics and test message frequency. Work on cluster tuning and the in-memory computing capabilities of Spark using Scala, based on the resources available on the cluster.

  • Develop shell scripts to automate jobs before moving to production in a configurable way by passing parameters. Schedule automated jobs on a daily or weekly basis, as required, using Control-M as the scheduler.

  • Work on operational controls such as job failure notifications and email notifications for failure logs and exceptions.

  • Support the project team for successful delivery of the client's business requirements through all the phases of the implementation.

 

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor’s degree in Computer Science or a related technical field.

  • Good knowledge of the Hadoop ecosystem, HDFS, Hive, Oozie, Sqoop, Kafka, Storm, Spark, and Scala.

  • Well versed in SDLC phases and release and change management processes.

  • Good analytical and problem-solving skills.





Job Description:

Looking for a Qlik Software Engineer for QlikView/Qlik Sense development and administration of advanced applications on the Qlik platform for a long-term project in Houston, Texas.

 

Responsibilities:

  • Analyze, design, develop, test, implement, and maintain detailed business intelligence solutions using QlikView and Qlik Sense on Windows operating systems.

  • Develop technical and functional requirements and design specifications for custom QlikView and Qlik Sense applications.

  • Create QlikView and Qlik Sense extract layers, logical layers, and star schema/snowflake schema data models, and develop functional reports in the QlikView and Qlik Sense application user interface.

  • Provide QlikView and Qlik Sense configurations and maintain QlikView and Qlik Sense instances and cluster monitoring.

  • Deploy the designed reports into the integration environment and implement performance testing, response index testing, governance testing, and user validation of data accuracy.

  • Develop and implement automation and efficiencies with the QlikView and Qlik Sense APIs (application programming interfaces).

  • Design and develop dashboards, monitor and track QlikView and Qlik Sense performance issues, and provide strategic support for QlikView and Qlik Sense integrations, deployments, and configurations.

  • Prepare workflow charts and diagrams specifying the detailed operations to be performed by the Qlik applications; plan and prepare technical reports and instructional manuals as documentation of program development.

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor’s degree in Computer Science or a related technical field.

  • Working experience in QlikView/Qlik Sense development and administration.

  • Innovative thinking and problem-solving capabilities in a fast-paced environment.

  • Data-driven and customer-focused.





Job Description:

Looking for a Software Engineer (BI, Qlik) with good knowledge of business intelligence tools, required to design, develop, and build advanced applications for the Qlik platform using languages including Swift, with experience working with QlikView/Qlik Sense.

 

Responsibilities:

  • Responsible for design, implementation, analysis, administration, and support of the Qlik suite of products (Server, Publisher, and Enterprise Management Console).

  • Perform Qlik application installations, upgrades, patches, backups, and configuration; implement change management principles; monitor application performance; and manage system integration issues.

  • Maintain QlikView Server and its Publisher license management. Regularly purge the QA and production servers. Manage and track user licensing and maintain site administrators.

  • Create ED groups and external user IDs directly within UMA. Manage groups associated with internal user GUIDs and/or client IDs to set ED groups. Perform periodic reviews of ED group users with the client to monitor user activities.

  • Responsible for user management tasks, including site access role management, assigning users to roles and groups within the site, and creating external user IDs directly within GUM.

  • Conduct POCs for Qlik and provide strategic recommendations for Qlik usage. Ensure availability meets or exceeds agreed-upon SLAs.

  • Coordinate with operational teams to troubleshoot application issues. Restart all Qlik services, manage tasks in the QMC, and share NAM file permissions and folder structures.

  • Ensure compliance with all data governance and retention policies with analysts, the visualization team, and project owners. Develop QVW visualization files, handle SQL database creation requests, and perform all publishing requests.

  • Assist with performance tuning of QlikView applications, server performance and health monitoring, and data source configuration, and maintain server logs.

  • Manage all client application features and settings, schedules, and search indexes; maintain all operational activities on BI applications; and stay up to date with current BI trends in the market.

Requirements:

  • The minimum education requirement for the above job duties is a Bachelor’s degree in Computer Science or a related technical field.

  • Working experience in QlikView/Qlik Sense development.




Office Address

Corporate Headquarters

9330 LBJ Freeway, Suite 900, Dallas, TX 75243, USA
Get Directions

Georgia Office:

1081 Cambridge SQ, Suite B, Alpharetta, GA 30009, USA
Get Directions

Phone

+1(214)561-6706

Fax

+1(214)561-6795

Email

info@spryinfosol.com

Business Hours

Monday - Friday

9:00 AM - 5:00 PM