Job Type : W2
Experience : 2 yrs
Location : Phoenix, AZ
Posted Date : 11-Mar-2020
Description :
We are looking for a candidate who has experience in software development in Splunk and knowledge of Big Data (Hive, Spark, Hadoop).
Responsibilities:
- Analyze, design, develop, test, implement and maintain detailed Splunk solutions using Splunk-based applications and beta Cloud add-ons and technologies, including Okta, SNOW and Enterprise Security, under multiple operating systems.
- Develop advanced Splunk dashboards, searches and reports to support various internal clients in Security, IT Operations and Application Development.
- Create and manage Splunk DB Connect identities, database connections, database inputs and outputs, KV lookups and access controls.
- Manage the Web Application Firewall (WAF) and maintain the Python scripts that support the Splunk infrastructure. Install, test and deploy supporting Splunk monitoring solutions with SNOW services.
- Identify malicious local device activity in Splunk using the Qualys application and lock down AMEX access to protect the infrastructure.
- Build detection and mitigation workflows for phishing and suspicious transactions from non-AMEX domains using Proofpoint solutions.
- Manage Splunk application logs and shared-service (Authentication Service) logs across the data lifecycle. Develop, implement and support automation and efficiencies in Splunk; introduce new alert content and data sources for Cyber Security Analysts.
- Prepare Splunk business workflow charts and diagrams specifying the detailed operations to be performed by the Application Security Monitoring framework team, and execute the cluster strategy.
Requirements:
- Minimum of a Bachelor's degree in Computer Science, Information Technology or a related technical field.
- 2 years of experience in software development in Splunk.
- Working knowledge of Splunk (searching and reporting; creating Splunk knowledge objects such as fields, lookups, data models and summary indexes; creating dashboards and views).
- Working knowledge of Python (many existing Python scripts must be maintained and re-engineered).
- Knowledge of Big Data (Hive, Spark, Hadoop) and Unix/shell programming is a plus.
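To illustrate the kind of small Python maintenance script this role involves, here is a minimal sketch that summarizes a Splunk CSV search export by sourcetype. The file layout and field names are assumptions for illustration, not part of the posting.

```python
# Hypothetical example: summarize a Splunk search export by sourcetype.
# Assumes a CSV export with a "sourcetype" column, as produced by a
# typical Splunk search export; column names here are illustrative.
import csv
import io
from collections import Counter

def count_by_sourcetype(csv_text):
    """Count events per sourcetype in a Splunk CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["sourcetype"] for row in reader)

sample = (
    "sourcetype,host\n"
    "access_combined,web01\n"
    "access_combined,web02\n"
    "syslog,fw01\n"
)
print(count_by_sourcetype(sample))
# Counter({'access_combined': 2, 'syslog': 1})
```

Scripts like this are typically wired into scheduled jobs or alert actions; the same counting could also be done in SPL with `stats count by sourcetype` before export.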