Job Code 0
SAP Project Manager, preferably with SAP RISE experience, or at least SAP S/4HANA experience across a few projects.
The customer currently has one SAP ECC environment and two SAP S/4HANA environments.
Will oversee the migration of the customer's legacy environments to SAP RISE.
Biopharmaceutical industry experience is a plus.
Tanya
Technical Recruiter | E: Nagella.Tanya@vervenest.com
WhatsApp Web :: https://lnkd.in/gsv78Bsk
LinkedIn :: https://lnkd.in/gTbrqEa9
Job Code 0
Senior SAP PRA/JVA/PSA Functional Consultant (Oil & Gas)
Duration: 3 Years+
Location: Houston, TX (50% onsite on alternate weeks; up to 100% onsite for the initial 2 months for workshops during the transition phase)
- Experience in SAP IS-Oil Upstream (PRA: Production & Revenue Accounting; JVA: Joint Venture Accounting; PSA: Production Sharing Accounting).
- Execute user stories, including pre-work, for all in-scope features included in planned Program Increments (PIs) for the client's PRA/JVA/PSA workstream.
- Produce configuration rationale and FDDs per the agreed schedule; collaborate closely with SAP development teams to finalize and clarify FDD requirements.
- Oil & Gas industry experience.
Tanya
Technical Recruiter | E: Nagella.Tanya@vervenest.com
WhatsApp Web :: https://lnkd.in/gsv78Bsk
LinkedIn :: https://lnkd.in/gTbrqEa9
Job Code 0
Role :: Data Analyst
Location :: Austin PA
Python experience: 3-5 years as a developer, back-end engineer, or front-end engineer.
Develop reusable, testable, and efficient Python code.
Strong Python (especially Pandas) and SQL skills.
Familiarity with software engineering techniques and tools such as version control, testing, logging, and GitHub.
The ability to interpret complex data using a variety of analytical, statistical, or machine learning techniques.
Knowledge of statistical and mathematical models such as regression, clustering, time series, and classification.
Visualization packages such as Plotly and Matplotlib.
Preferred Skills/Experience:
Exposure to ML concepts such as LSTMs, RNNs, and ConvNets; experience with scikit-learn, NLTK, Keras, and other ML libraries in Python.
Data science experience is mandatory.
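For context only, a minimal sketch of the kind of Pandas, scikit-learn, and Matplotlib workflow this role describes; the file name, columns, and model choice are hypothetical and not part of the job description.

```python
# Illustration only: file name, columns, and model choice are hypothetical.
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("sales.csv", parse_dates=["date"])   # hypothetical dataset
df["month"] = df["date"].dt.month                     # simple feature engineering

X = df[["month", "price", "promo"]]                   # hypothetical feature columns
y = df["revenue"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")

# Quick diagnostic plot of predicted vs. actual values.
plt.scatter(y_test, model.predict(X_test))
plt.xlabel("actual revenue")
plt.ylabel("predicted revenue")
plt.show()
```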
Tanya
Technical Recruiter | E: Nagella.Tanya@vervenest.com
WhatsApp Web :: https://lnkd.in/gsv78Bsk
LinkedIn :: https://lnkd.in/gTbrqEa9
Job Code 0
Role :: GCP Lead Cloud API Developer
Location :: Franklin, TN, 37068
Type: Contract/ Fulltime
Primary skills (mandatory): GCP, Java, microservices, REST API development, and GCP services, with hands-on experience in developing secure, scalable APIs.
Domain: Manufacturing / Automotive
Job Description:
We are looking for an experienced GCP Lead Cloud API Developer to lead the design, development, and implementation of cloud-native APIs on Google Cloud Platform (GCP). The candidate should have a strong background in cloud architecture, API development, and GCP services, with hands-on experience developing secure, scalable APIs, and will work closely with cross-functional and client teams to ensure the integration and functionality of cloud services, with a focus on performance, security, and scalability.
Key Responsibilities:
API Development:
Lead the design, development, and deployment of secure and scalable RESTful APIs on GCP.
Build APIs to support data ingestion from various platforms and formats (e.g., CSV, JSON streams, etc.).
Implement best practices for API security, authentication, and authorization (OAuth2, JWT).
Develop API documentation using OpenAPI/Swagger standards, or other standards as requested by the client.
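For illustration only, a minimal Python sketch of a JWT-protected REST endpoint with auto-generated OpenAPI docs, assuming FastAPI and PyJWT; the route, secret, and claims are hypothetical and not the client's actual stack.

```python
# Illustration only: a JWT-protected REST endpoint; FastAPI auto-generates
# OpenAPI/Swagger docs at /docs. Secret, route, and claims are hypothetical.
import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

SECRET_KEY = "replace-me"           # assumption: symmetric HS256 signing for the demo
app = FastAPI(title="Orders API")   # hypothetical service name
bearer = HTTPBearer()

def current_claims(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> dict:
    """Validate the bearer token and return its claims."""
    try:
        return jwt.decode(creds.credentials, SECRET_KEY, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")

@app.get("/orders/{order_id}")
def get_order(order_id: int, claims: dict = Depends(current_claims)) -> dict:
    # A real service would read from Cloud SQL, BigQuery, etc.
    return {"order_id": order_id, "requested_by": claims.get("sub")}
```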
Cloud Architecture:
Help design and implement GCP-based cloud architectures to support the development of APIs and services.
Select appropriate GCP services for API hosting, storage, data processing (e.g., Cloud Functions, Cloud Run, App Engine, BigQuery, Pub/Sub, Cloud Storage).
Optimize GCP resources for performance, cost-efficiency, and scalability.
Infrastructure Setup:
Help identify required GCP environments, including IAM policies, VPCs, Firewalls, and GCP services.
Implement monitoring and logging with Google Cloud's operations suite (Cloud Monitoring and Cloud Logging, formerly Stackdriver) to ensure high availability and reliability.
Ensure infrastructure as code practices using Terraform, Cloud Deployment Manager, or similar tools.
API Gateway and Traffic Management:
Implement, or work with the team to configure, API Gateway for secure access to backend services.
Help identify and set up traffic management, caching, throttling, and quotas for APIs.
Design and implement strategies for API versioning and lifecycle management.
Data Ingestion and Processing:
Develop solutions for real-time and batch data ingestion using Pub/Sub, Dataflow, and other GCP data services.
Work with external systems to pull in data (via CSV uploads, JSON streams, etc.) and process it in the cloud.
Integrate GCP services like BigQuery, Cloud Storage, and Cloud SQL for data storage and analytics.
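For illustration only, a minimal Python sketch of the two ingestion paths described above: streaming JSON records to Pub/Sub and landing CSV batches in Cloud Storage. The project, topic, and bucket names are assumptions.

```python
# Illustration only: project, topic, and bucket names are assumptions.
import json
from google.cloud import pubsub_v1, storage

PROJECT_ID = "my-project"          # assumption
TOPIC_ID = "ingest-events"         # assumption
BUCKET = "ingest-landing-zone"     # assumption

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

def publish_event(record: dict) -> None:
    """Stream one JSON record to Pub/Sub for downstream Dataflow/BigQuery processing."""
    future = publisher.publish(topic_path, json.dumps(record).encode("utf-8"))
    future.result()  # block until Pub/Sub acknowledges the message

def land_csv(local_path: str, object_name: str) -> None:
    """Upload a CSV batch file to Cloud Storage for batch processing."""
    bucket = storage.Client().bucket(BUCKET)
    bucket.blob(object_name).upload_from_filename(local_path)
```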
CI/CD and Automation:
Set up continuous integration and deployment (CI/CD) pipelines using Cloud Build, GitHub Actions, or similar tools for automating API deployments and cloud resource provisioning.
Good to have: automated testing for APIs, including unit, integration, and performance testing.
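Continuing the hypothetical FastAPI sketch above, a minimal pytest example of the kind of automated API testing described here; the module name orders_api is an assumption.

```python
# Illustration only: exercises the hypothetical orders_api sketch shown earlier.
import jwt  # PyJWT
from fastapi.testclient import TestClient

from orders_api import SECRET_KEY, app   # hypothetical module name

client = TestClient(app)

def test_get_order_rejects_missing_token():
    # FastAPI's HTTPBearer rejects requests without an Authorization header.
    assert client.get("/orders/1").status_code in (401, 403)

def test_get_order_accepts_valid_token():
    token = jwt.encode({"sub": "analyst@example.com"}, SECRET_KEY, algorithm="HS256")
    resp = client.get("/orders/1", headers={"Authorization": f"Bearer {token}"})
    assert resp.status_code == 200
    assert resp.json()["order_id"] == 1
```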
Security and Compliance:
Ensure the security of the cloud environment through robust IAM policies, VPC setup, encryption, and audit logging.
Implement security best practices, including encryption for data at rest and in transit.
Maintain compliance with industry standards and regulations (e.g., GDPR, HIPAA, SOC2).
Team Leadership:
Provide technical leadership and guidance to the development team.
Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
Mentor and support junior developers, ensuring best practices in coding, cloud architecture, and API development.
Required Skills and Experience:
Cloud Expertise:
5+ years of experience with Google Cloud Platform (GCP), including hands-on experience with core services (Cloud Functions, Cloud Run, App Engine, BigQuery, Pub/Sub, Cloud Storage, Cloud SQL).
Strong understanding of cloud architecture, microservices, and API management in GCP.
API Development:
Expertise in developing and managing RESTful APIs using GCP services.
Experience with API Gateway, OAuth2, JWT, OpenAPI/Swagger for API security and documentation.
Knowledge of API performance optimization and scalability.
Programming Languages:
Proficiency in one or more programming languages such as Python, Java, Node.js, or Go.
Experience with serverless architectures (e.g., Cloud Functions) and containerized applications (e.g., Cloud Run, Kubernetes).
Data Management:
Experience with real-time and batch data ingestion solutions, including Pub/Sub, Dataflow, and BigQuery.
Familiarity with structured and unstructured data storage solutions (e.g., Cloud Storage, BigQuery).
Infrastructure as Code (IaC):
Experience with Infrastructure as Code tools like Terraform, Cloud Deployment Manager, or similar.
CI/CD Automation:
Experience with setting up CI/CD pipelines using Cloud Build, Jenkins, or GitHub Actions for automating API deployment and testing.
Security and Compliance:
Strong knowledge of cloud security best practices, IAM, and VPC networking.
Experience in ensuring compliance with industry standards and regulations (GDPR, HIPAA, etc.).
Preferred Qualifications:
Google Cloud Professional certifications (e.g., Professional Cloud Architect, Professional Data Engineer).
Experience with hybrid or multi-cloud environments.
Familiarity with DevOps and SRE (Site Reliability Engineering) practices.
Experience with big data tools and analytics on GCP (Dataflow, Dataproc, BigQuery).
Tanya
Technical Recruiter | E: Nagella.Tanya@vervenest.com
WhatsApp Web :: https://lnkd.in/gsv78Bsk
LinkedIn :: https://lnkd.in/gTbrqEa9
Job Code 0
Role :: Senior Data Scientist
Location :: Redmond WA
Description
Build & CI/CD: Practical experience with build, CI/CD pipelines, and container frameworks such as Docker and Kubernetes. P0
Azure Stack: Familiar with Azure services such as Azure Functions, Azure Queues, and Azure Container Apps (a plus). P0
Real-time Data Processing: Experienced in building real-time data pipelines and scalable products. P0
Object-Oriented Programming: Proficient in Python and/or .NET, with extensive experience in object-oriented programming, functional programming, and code optimization. P0
Databases: Hands-on experience with databases (Kusto, Cosmos DB, Cosmos, etc.) is an advantage. P1
Proficient in setting up A/B experimentation using Cosmos and Scope jobs (Microsoft big data technology). P1
P0 indicates a high-priority (must-have) requirement; P1 indicates a lower priority.
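For illustration only, a minimal Python sketch of an Azure Storage Queue polling loop of the kind this role describes; the connection string and queue name are hypothetical.

```python
# Illustration only: connection string and queue name are hypothetical.
import os
import time
from azure.storage.queue import QueueClient

conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]   # assumed to be set
queue = QueueClient.from_connection_string(conn_str, queue_name="telemetry")

def process(body: str) -> None:
    """Placeholder for the real-time processing step."""
    print(f"processing: {body}")

while True:
    for msg in queue.receive_messages(messages_per_page=16):
        process(msg.content)
        queue.delete_message(msg)   # acknowledge only after successful processing
    time.sleep(1)                   # simple polling loop; a real service would scale out
```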
Tanya
Technical Recruiter | E: Nagella.Tanya@vervenest.com
WhatsApp Web :: https://lnkd.in/gsv78Bsk
LinkedIn :: https://lnkd.in/gTbrqEa9