You should be someone who works effectively and efficiently in a team. You should have strong communication skills along with other fundamental traits: being an effective listener, committed to goals, reliable, intrinsically motivated, and able to multitask, while remaining honest and positive throughout your journey.
So choose your team wisely: when you’re working with a great team, every day can feel like an adventure. With a bad team, your office can quickly become a dungeon.
The Refer and Earn program was created to help job seekers in need, and as a token of appreciation we pay a referral amount to the person who refers, creating a mutually beneficial relationship that drives growth and success.
We pay you between 3,000 and 15,000 once the candidate you refer completes 90 days with the organization. The payout depends on our contract with the client and the candidate's salary.
Mode of payment: Account Transfer/UPI/Paytm/PhonePe
Please mention the job title you are interested in under the Position tab when applying. If you are looking for a different role, kindly do the same.
We are looking for an experienced, self-driven, analytical, and strategic Senior Data Engineer. In this role, you will work across a large and complex data lake/warehouse environment. You are passionate about bringing disparate datasets together to answer business questions. You should have deep expertise in the creation and management of datasets and a proven ability to translate data into meaningful insights through collaboration with product managers, data engineers, business intelligence developers, operations managers, and leaders. You will own end-to-end development of data engineering solutions to complex questions and play an integral role in strategic decision-making.
Knowledge & Skills
In this role, you will have the opportunity to display and develop your skills in the following areas:
• Interface with PMs, business customers, and Data Architects/Modelers to understand requirements and implement solutions
• Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data pipelines on distributed data processing platforms using AWS technologies, providing ad hoc access to large datasets and computing power
• Explore and learn the latest AWS big data technologies, evaluate and make decisions around the use of new or existing software products to design the data architecture
• Recognize and adopt best practices in data processing, reporting, and analysis: data integrity, test design, analysis, validation, and documentation
Basic Qualifications
• Bachelor’s degree in Computer Science or related technical field, or equivalent work experience.
• 2+ years of work experience with ETL, Data Modeling, and Data Architecture.
• 2+ years of experience with SQL and large data sets, data modeling, ETL development, and data warehousing, or similar skills.
• 2+ years of experience with the AWS technology stack, including Redshift, RDS, S3, EMR, or similar solutions built around Hive/Spark, OR 2+ years of experience with the Azure technology stack, including ADF, Azure Blob Storage, and Azure Synapse.
Preferred Qualifications
• Excellent at ETL optimization and at designing, coding, and tuning big data processes using Apache Spark or similar technologies.
• Experience operating very large data warehouses or data lakes.
• Experience with building data pipelines and applications to stream and process datasets at low latencies.
• Demonstrated efficiency in handling data – tracking data lineage, ensuring data quality, and improving discoverability of data.
• Knowledge of distributed systems and data architecture – able to design and implement batch and stream data processing pipelines, and to optimize data distribution, partitioning, and MPP workloads (a minimal batch-pipeline sketch follows this list).
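For illustration only, here is a minimal PySpark sketch of the kind of batch pipeline this role involves: reading raw events from S3, applying basic cleansing, and writing partitioned Parquet for downstream analytics. The bucket names, paths, and column names are hypothetical placeholders (the s3:// paths assume an EMR-style filesystem), not details of any actual client environment.

```python
# Minimal PySpark batch ETL sketch (illustrative only).
# Bucket names, paths, and columns are hypothetical placeholders;
# the s3:// paths assume an EMR-style filesystem.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-events-etl")  # hypothetical job name
    .getOrCreate()
)

# Read raw JSON events landed in S3.
raw = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")

# Basic cleansing and enrichment: drop malformed rows, derive a date column,
# and deduplicate on the event identifier.
cleaned = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_date", F.to_date(F.col("event_ts")))
       .dropDuplicates(["event_id"])
)

# Write partitioned Parquet to the curated zone for ad hoc querying.
(
    cleaned.write
           .mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3://example-curated-bucket/events/")
)

spark.stop()
```

In practice a pipeline with this structure would be scheduled (for example via Airflow or EMR steps) and extended with data-quality checks and lineage tracking.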
Summary: We are seeking a highly motivated and talented Data Scientist to join our dynamic team. With a passion for data-driven solutions and a strong academic background, the successful candidate will play a key role in supporting our business decisions and driving operational efficiencies through the use of advanced analytics and predictive modelling techniques.
Key Responsibilities:
• Analyse large and complex datasets to uncover key insights and inform business decisions.
• Design and implement predictive models to support the optimization of trading strategies and market forecasting (a minimal modelling sketch follows this list).
• Collaborate with cross-functional teams to identify business needs and develop data-driven solutions.
• Stay up to date with the latest advancements in data science and machine learning and apply these to the power trading domain.
• Communicate findings and insights to stakeholders in a clear and concise manner.
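To make the predictive-modelling responsibility concrete, the sketch below fits a simple forecasting model with scikit-learn. The CSV path, feature names, target column, and model choice are illustrative assumptions only; the real work would depend on the actual trading data and business question.

```python
# Minimal predictive-modelling sketch for market forecasting (illustrative only).
# The CSV path, feature names, and target column are hypothetical assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Load historical market data (placeholder file and columns).
df = pd.read_csv("power_prices.csv", parse_dates=["timestamp"])
features = ["load_forecast", "wind_forecast", "solar_forecast", "price_lag_24h"]
target = "day_ahead_price"

# Hold out the most recent 20% of rows; shuffle=False preserves time order.
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, shuffle=False
)

# Fit a simple gradient-boosted model and report holdout error.
model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)
preds = model.predict(X_test)
print(f"Holdout MAE: {mean_absolute_error(y_test, preds):.2f}")
```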
Requirements:
• 6-9 years of experience in data science or a related field, with a proven track record of success
• Strong academic background in computer science, mathematics, statistics, or a related field
• Excellent problem-solving and critical thinking skills
• Proficiency in the Python programming language.
• Knowledge of machine learning algorithms and data visualization tools
• Excellent communication and collaboration skills, with the ability to work effectively in a fast-paced, cross-functional team environment.
This is an outstanding opportunity for a driven and talented Data Scientist to join a market-leading power trading company and make a significant impact on our business operations. If you possess a passion for data science and a proven track record of success, we encourage you to apply.
At least 5 years of experience in cloud technologies (AWS development) along with DevOps.
Experience implementing DevOps practices and tools in areas such as CI/CD using Jenkins, environment automation, release automation, virtualization, infrastructure as code, and metrics tracking.
Hands-on experience configuring DevOps tools in different environments.
Strong knowledge of DevOps design patterns, processes, and best practices.
Hands-on experience setting up build pipelines.
Prior working experience in system administration or architecture in Windows or Linux.
Must have experience with Git (Bitbucket, GitHub, GitLab).
Hands-on experience with Jenkins pipeline scripting.
Hands-on knowledge of at least one scripting language (NAnt, Perl, Python, Shell, or PowerShell).
Configuration-level skills in tools like SonarQube (or similar) and Artifactory.
Expertise in virtual infrastructure (VMware, VirtualBox, QEMU, KVM, or Vagrant) and environment automation/provisioning using SaltStack, Ansible, Puppet, or Chef.
Deploying, automating, maintaining, and managing AWS cloud-based production systems, including capacity monitoring (a minimal Python sketch follows these requirements). Experience with Python programming is a must.
Good to have: experience migrating code repositories from one source control system to another.
Hands-on experience with Docker containers and orchestration-based deployments such as Kubernetes, Service Fabric, and Docker Swarm.
Must have good communication and problem-solving skills.
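As a small, hedged example of the capacity-monitoring work mentioned above, the Python sketch below uses boto3 to pull recent CPU utilisation for an EC2 instance from CloudWatch and flag high readings. The region, instance ID, and 80% threshold are hypothetical assumptions, not values from any real environment.

```python
# Minimal capacity-monitoring sketch using boto3 and CloudWatch (illustrative only).
# Region, instance ID, and the 80% threshold are hypothetical assumptions.
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # placeholder region

now = datetime.now(timezone.utc)
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)

# Flag any 5-minute window above the assumed 80% CPU threshold.
for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    if point["Average"] > 80:
        print(f"{point['Timestamp']}: high CPU ({point['Average']:.1f}%)")
```

The same pattern extends to publishing custom metrics with put_metric_data or to alarms managed through infrastructure-as-code tooling.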
Job Description - Senior .NET Developer (.NET, AWS)
Location: Jaipur
Experience: 5+ years (C#, cloud)
Key Roles and Responsibilities:
BSc/BE/BTech in Computer Science, Engineering, or a related field.
2-5 years of experience with cloud technologies
3-4 years of experience with the AWS tech stack (S3, ElastiCache, Route 53, EKS, SNS/SQS, Elasticsearch, VPC, IAM)
Experience in C# (Java or other OO language experience is fine as well)
Production-level experience working on AWS cloud and related technologies
Hands-on experience with AWS S3, ElastiCache, Route 53, Kubernetes (EKS), SNS/SQS, and Elasticsearch/OpenSearch PaaS.
Understanding of VPCs, DNS, IAM, EBS, etc. Experience with elastic load balancers and auto-scaling of infrastructure and services. AWS Certified Solutions Architect preferred but not required.
Experience designing and developing web- and API-based solutions on .NET or .NET Core. Project experience in software design and development on Azure, with an emphasis on microservices (Azure Kubernetes Service (AKS), Service Fabric) and serverless capabilities (Functions, Event Hub, Service Bus, etc.).
Experience with CI/CD pipelines (Azure DevOps), containers (Docker, Kubernetes), unit testing frameworks (Jest, Enzyme, NUnit), modern web programming (Node.js, React), code versioning and integration (Git, Bitbucket, GitHub), and security (standard web security, Single Sign-On, and web service security using security tokens).
Proficiency in front-end technologies (CSS, JavaScript) and modern frameworks (React, Angular, Vue, etc.), plus familiarity with Spring Boot on the back end.
Strong back-end development skills with experience in relevant programming languages (Java, Node.js, etc.).
Strong knowledge of Python is a must.
Good knowledge of data analytics concepts, tools, and technologies (SQL, data visualization libraries, etc.); a brief analytics sketch follows these requirements.
Familiarity with database systems (SQL, NoSQL) and database design principles.
Understanding of cloud platforms (AWS, Azure, GCP) is a plus.
Strong problem-solving skills and the ability to troubleshoot complex technical issues.
Excellent collaboration and communication skills for working in cross-functional teams.
Any experience with AI/ML projects is a plus.
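For the data-analytics requirement above, here is a brief sketch combining SQL and a Python visualization library. The SQLite file, table, and column names are hypothetical placeholders used only to show the shape of the task.

```python
# Minimal data-analytics sketch: query a relational database and plot the result
# (illustrative only; the database file, table, and columns are placeholders).
import sqlite3

import matplotlib.pyplot as plt
import pandas as pd

# Pull an aggregate directly from a relational database.
conn = sqlite3.connect("app_metrics.db")  # placeholder database
df = pd.read_sql_query(
    "SELECT feature_name, COUNT(*) AS usage_count "
    "FROM feature_events GROUP BY feature_name ORDER BY usage_count DESC",
    conn,
)
conn.close()

# Chart the result so stakeholders can read it at a glance.
df.plot(kind="bar", x="feature_name", y="usage_count", legend=False)
plt.ylabel("Events")
plt.title("Feature usage (sample query)")
plt.tight_layout()
plt.savefig("feature_usage.png")
```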
4-6 years of experience as a Salesforce CPQ Developer, including CPQ configurations and advanced CPQ skills in Customer Scripts and Renewals
NICE TO HAVE -
Aura, LWC, and integrations experience (SOAP, REST, OOB) - Salesforce CPQ Engineer x 2, Computer & Networks client
Duties:
Customize Salesforce both by configuration and custom development.
Write reusable, testable, and efficient code. Document technical designs for new projects and enhancements to existing solutions.
Diagnose problems with existing application code and develop technical solutions that resolve them. Collaborate with other cross-platform team members to build effective integrations. Participate in design sessions and code reviews.
As a Full Stack Java Developer with expertise in Spring Boot, Java, and Angular/React, working remotely from India under the guidance of a Technical Delivery Manager based in Canada, your responsibilities would be as follows:
Application Development: Design, develop, and maintain applications using Java, Spring Boot, and relevant frameworks. Implement server-side logic, APIs, and database interactions to create robust and scalable solutions.
Front-End Development: Build user-friendly and responsive web interfaces using Angular or React. Collaborate with designers to implement UI/UX designs and ensure seamless integration with the back-end.
CI/CD Pipeline: Configure and manage Continuous Integration/Continuous Deployment (CI/CD) pipelines using tools like GitHub, GitHub Actions, and cloud-native CI/CD services. Automate build, test, and deployment processes to ensure efficient and reliable delivery.
Cloud Deployment: Deploy applications to various cloud environments such as AWS, Azure, or Google Cloud Platform. Configure and manage infrastructure as code using tools like Terraform or CloudFormation to ensure scalable and resilient deployments.
Unit Testing and Documentation: Write unit tests to validate the functionality of developed modules and document the codebase for ease of maintenance and collaboration.
API Development: Design, develop, and maintain RESTful APIs using Spring Boot. Ensure API security, versioning, and adherence to industry standards. Collaborate with front-end and mobile app developers to provide the required APIs for integration.
Collaboration with Cross-functional Teams: Work closely with the Technical Delivery Manager in Canada, along with designers, product managers, and other developers. Participate in requirement gathering, architecture discussions, and feature implementation to deliver high-quality solutions.
Troubleshooting and Debugging: Identify and resolve issues or bugs in applications, both on the back end and the front end. Use debugging tools and techniques to analyze and resolve problems effectively.
Code Reviews: Actively participate in code reviews, providing feedback to team members and ensuring adherence to coding standards, best practices, and performance optimizations.
Stay Updated with Technologies: Keep up to date with the latest trends, tools, and technologies relevant to full-stack Java development, cloud deployments, and front-end frameworks like Angular or React. Apply emerging technologies where suitable to improve application performance and user experience.