Cognizant Exclusive Walk-In Drive | Hiring For Multiple Roles

Cognizant Exclusive Walk-In Drive:

Cognizant is hiring candidates for multiple roles across locations in India. The complete details of the Cognizant Exclusive Walk-In Drive are as follows.

Company Hiring: Cognizant
Required Education: Graduation
Job Roles: 1) PySpark Developer
2) Snowflake Developer
3) GCP Services
4) MSFT Power BI Developer
5) Azure Databricks
6) AWS Data Pipeline
Job Type: Full-Time
Walk-In Drive Location: Indore
Job Role 1: PySpark Developer

Required Qualifications & Skills:

  • Possess strong experience in Python for data processing and automation.
  • Demonstrate expertise in Databricks SQL for data analysis and reporting.
  • Have hands-on experience with Databricks Workflows for data integration.
  • Show proficiency in PySpark for large-scale data processing.
  • Experience in Park Operations is a plus, providing valuable domain knowledge.
  • Exhibit excellent problem-solving skills and attention to detail.
  • Display strong communication skills to effectively collaborate with team members and stakeholders.
  • Have a proactive approach to learning and staying updated with new technologies.
  • Certifications Required: Databricks Certified Associate Developer for Apache Spark and Python Certification.

Job Summary:

Seeking a highly skilled Sr. Developer with 4 to 8 years of experience to join our team. The ideal candidate will have expertise in Python, Databricks SQL, Databricks Workflows, and PySpark. Experience in Park Operations is a plus. This role involves developing and optimizing data workflows to support our business objectives and enhance operational efficiency.

Responsibilities:

  • Develop and maintain data workflows using Databricks Workflows to ensure seamless data integration and processing.
  • Utilize Python to create efficient and scalable data processing scripts.
  • Implement and optimize SQL queries within Databricks to support data analysis and reporting needs.
  • Leverage PySpark to handle large-scale data processing tasks and improve performance.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Provide technical guidance and support to junior developers to foster skill development and knowledge sharing.
  • Conduct code reviews to ensure code quality and adherence to best practices.
  • Troubleshoot and resolve technical issues related to data workflows and processing.
  • Monitor and optimize the performance of data workflows to ensure they meet business requirements.
  • Develop and maintain documentation for data workflows, processes, and best practices.
  • Stay updated with the latest industry trends and technologies to continuously improve data processing capabilities.
  • Work closely with stakeholders to gather requirements and provide regular updates on project progress.
  • Ensure data security and compliance with company policies and industry regulations.
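To give candidates a concrete sense of the workflow development described above, here is a minimal, illustrative PySpark sketch: read raw data, aggregate it, and publish a Delta table for Databricks SQL reporting. All paths, table names, and columns (park_events, analytics.daily_ride_throughput, event_type, event_ts, ride_id) are hypothetical, and the Delta write assumes a Databricks workspace.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: count completed rides per ride per day from raw park-operations events.
spark = SparkSession.builder.appName("park_ops_daily_agg").getOrCreate()

events = spark.read.parquet("/mnt/raw/park_events")  # assumed input location

daily_throughput = (
    events
    .filter(F.col("event_type") == "ride_completed")
    .groupBy(F.to_date("event_ts").alias("event_date"), "ride_id")
    .agg(F.count("*").alias("completed_rides"))
)

# Publish as a Delta table so Databricks SQL dashboards can query it (assumes Databricks/Delta).
daily_throughput.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_ride_throughput")
```

In a Databricks Workflow, a script like this would typically run as one task in a scheduled job, with downstream tasks handling reporting and data-quality checks.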
Apply Link: Click Here To Apply
Job Role 2: Snowflake Developer

Required Qualifications & Skills:

  • At least 4 years of hands-on experience working with Snowflake
  • Proficiency in interpreting query profiles in Snowflake
  • Ability to do performance tuning and database development
  • Ability to perform system monitoring and address various issues in the systems
  • Very good understanding of Snowflake security model
  • Proficiency in automation
  • Understanding of various data-warehouse-related technologies, along with their benefits, downsides, and best use cases.
  • Great communication and collaboration skills
  • Willingness and commitment to learn other database, automation, and cloud technologies.

Responsibilities:

  • Code migration
  • Production environment checks
  • Database management and maintenance
  • Code reviews
  • Change migrations through environments
  • Debugging production related issues
  • Coordination of all changes with broader BI team
  • On-call support
  • New team member on-boarding
  • Problem management and automation of recurring issues
  • Update operational manuals/SOP and technical documents
  • Meet SLAs as per contract requirements
  • Should have a basic understanding of admin activities
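As a rough illustration of the performance-tuning and monitoring duties above, here is a small Python sketch using the Snowflake connector to list the slowest queries of the last day from ACCOUNT_USAGE. Connection parameters are placeholders; in practice they would come from a secrets manager, and querying ACCOUNT_USAGE requires appropriate privileges.

```python
import snowflake.connector

# Hypothetical connection parameters; real values should come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="BI_WH",
    database="ANALYTICS",
)

try:
    cur = conn.cursor()
    # Pull the slowest queries of the last day to guide performance tuning.
    cur.execute(
        """
        SELECT query_id, total_elapsed_time, bytes_scanned, query_text
        FROM snowflake.account_usage.query_history
        WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
        """
    )
    for query_id, elapsed_ms, bytes_scanned, text in cur.fetchall():
        print(query_id, elapsed_ms, bytes_scanned)
finally:
    conn.close()
```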
Apply Link: Click Here To Apply
Job Role 3: GCP Services

Required Qualifications & Skills:

  • Bachelor’s degree in Computer Science, Information Technology, or related field.
  • 5+ years of experience with GCP services and cloud infrastructure.
  • Strong experience with Terraform, Deployment Manager, or other Infrastructure as Code (IaC) tools.
  • Proficiency in Linux/Unix environments and scripting languages (Python, Bash, etc.).
  • Experience with GCP services like Compute Engine, Kubernetes (GKE), Cloud Functions, BigQuery, and Pub/Sub.
  • Deep knowledge of networking (VPC, VPN, load balancing) and cloud security best practices.
  • Experience with CI/CD pipelines, version control systems (Git), and automation tools (Jenkins, GitLab, etc.).
  • Strong troubleshooting skills and experience with monitoring and logging tools (e.g., Stackdriver, Prometheus).
  • Excellent problem-solving and communication skills.

Key Responsibilities:

  • Design & Architecture:
    • Architect, design, and deploy GCP cloud infrastructure for new and existing services.
    • Collaborate with development and infrastructure teams to create scalable and secure cloud solutions.
    • Develop best practices and ensure that architectural decisions support system scalability and security.
  • Cloud Management:
    • Manage GCP services including Compute Engine, Cloud Storage, VPC, BigQuery, Cloud SQL, Cloud Functions, and more.
    • Optimize cloud resource utilization, cost management, and ensure high availability.
    • Implement and manage CI/CD pipelines for automated deployment using GCP services.
  • Automation & DevOps:
    • Develop infrastructure as code using Terraform, Deployment Manager, or other relevant tools.
    • Automate cloud infrastructure tasks and ensure high levels of automation across all environments.
    • Ensure continuous integration, continuous delivery (CI/CD), and manage release pipelines.
  • Monitoring & Security:
    • Implement GCP monitoring tools such as Stackdriver to monitor infrastructure and applications.
    • Set up and manage GCP IAM policies, security protocols, and ensure best practices for cloud security.
    • Perform regular security audits and maintain compliance with security standards.
  • Collaboration & Support:
    • Collaborate with cross-functional teams to troubleshoot production issues and support cloud services.
    • Provide leadership and mentorship to junior engineers.
    • Stay updated with the latest GCP services, best practices, and emerging trends.
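To illustrate one of the GCP services named above (Pub/Sub) in code, here is a minimal Python sketch that publishes a message to a topic. The project and topic IDs are hypothetical, and it assumes Application Default Credentials are available (for example, a service account on GKE or Compute Engine).

```python
from google.cloud import pubsub_v1

# Hypothetical project and topic names; replace with values from your environment.
PROJECT_ID = "my-gcp-project"
TOPIC_ID = "infra-events"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# Publish a small JSON payload; Pub/Sub messages are byte strings.
future = publisher.publish(topic_path, b'{"event": "deployment_completed"}')
print("Published message id:", future.result())
```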
Apply Link: Click Here To Apply
Job Role 4: MSFT Power BI Developer

Required Qualifications & Skills:

  • Experience with Power BI: Proven experience with Power BI development, including Power BI Desktop, Power BI Service, and Power Query.
  • Data Modeling & DAX: Strong skills in data modeling, creating complex measures using DAX, and developing calculated columns and tables.
  • SQL Knowledge: Proficiency in SQL for querying relational databases and extracting data from various sources (SQL Server, Azure, etc.).
  • ETL Knowledge: Familiarity with ETL tools and processes to cleanse, transform, and load data.
  • Visualization Best Practices: Expertise in creating intuitive and interactive dashboards with a user-friendly design.
  • Problem Solving: Strong analytical and problem-solving skills with the ability to work independently and as part of a team.
  • Communication Skills: Excellent verbal and written communication skills to interact with stakeholders and present findings.

Key Responsibilities:

  • Develop & Maintain Dashboards: Design, develop, and maintain interactive Power BI reports, dashboards, and visualizations based on business requirements.
  • Data Modeling: Create and maintain data models using Power BI Desktop to enable seamless reporting and analytics. Utilize DAX (Data Analysis Expressions) for creating complex calculations.
  • ETL Processes: Work with the data engineering team to extract, transform, and load (ETL) data from various sources (e.g., SQL Server, Excel, SharePoint, cloud platforms) into Power BI.
  • Data Integration: Integrate data from diverse sources, ensuring data accuracy, integrity, and consistency.
  • Stakeholder Engagement: Collaborate with business users to understand reporting needs and translate them into technical specifications.
  • Performance Optimization: Optimize Power BI reports and dashboards for improved performance, including query optimization, indexing, and data model adjustments.
  • Documentation: Create and maintain clear documentation for all reports, dashboards, and Power BI solutions to ensure sustainability and knowledge sharing.
  • Training & Support: Provide support and training to end-users on Power BI best practices, self-service BI, and report usage.
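The ETL and SQL items above are often prototyped outside Power BI Desktop. As a rough, hypothetical sketch, the Python snippet below pulls a year of sales rows from SQL Server, adds a revenue column, and writes a clean extract that a Power BI dataset could import. The server, credentials, and table names are placeholders, and the DAX modeling itself would still happen inside Power BI.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string and table names; real values depend on your environment.
engine = create_engine(
    "mssql+pyodbc://bi_user:password@sales-db-server/SalesDW?driver=ODBC+Driver+17+for+SQL+Server"
)

# Extract and lightly transform sales data before it is modeled in Power BI.
query = """
    SELECT order_date, region, product_id, quantity, unit_price
    FROM dbo.FactSales
    WHERE order_date >= DATEADD(year, -1, GETDATE())
"""
sales = pd.read_sql(query, engine)
sales["revenue"] = sales["quantity"] * sales["unit_price"]

# Save a clean extract that a Power BI dataset can pick up.
sales.to_csv("sales_extract.csv", index=False)
```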
Apply Link: Click Here To Apply
Job Role 5: Azure Databricks

Required Qualifications & Skills:

Must Have Skills:

  • Hands-on experience with Azure Databricks, Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
  • In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
  • Working experience with Python, Spark, and PySpark.

Good To Have Skills:

  • Knowledge of Dev-Ops processes (CI/CD) and infrastructure as code.
  • Knowledge of Master Data Management (MDM) and Data Quality tools.
  • Experience developing REST APIs.
  • Experience with Kafka.
  • Knowledge of key machine learning concepts and MLOps.

Responsibilities:

  • Develop and maintain ETL processes using Informatica Cloud CLI to ensure seamless data integration.
  • Oversee the design and implementation of data warehousing solutions to support business intelligence initiatives.
  • Provide expertise in Data Lake Concepts to optimize data storage and retrieval processes.
  • Utilize Informatica Cloud Scheduler to automate data workflows and improve operational efficiency.
  • Implement Cloud DWH solutions to enhance data accessibility and scalability.
  • Write and optimize SQL queries to support data analysis and reporting needs.
  • Collaborate with cross-functional teams to ensure data integration aligns with business requirements.
  • Utilize Unix Shell Scripting to automate routine tasks and improve system performance.
  • Ensure data quality and integrity through rigorous testing and validation processes.
  • Provide technical guidance and support to junior developers and team members.
  • Monitor and troubleshoot data integration issues to ensure timely resolution.
  • Document technical specifications and maintain comprehensive project documentation.
  • Stay updated with the latest industry trends and technologies to continuously improve data integration processes.
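Tying back to the must-have Azure Databricks and PySpark skills for this role, here is a minimal, hypothetical sketch of an upsert into a dimension table using Delta Lake's merge API. The ADLS path and table names are placeholders, and the code assumes a Databricks cluster where Delta Lake and storage access are already configured.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Assumes a Databricks cluster with Delta Lake and ADLS access already set up.
spark = SparkSession.builder.getOrCreate()

# Hypothetical landing path and target table.
updates = spark.read.format("csv").option("header", "true").load(
    "abfss://landing@mydatalake.dfs.core.windows.net/customers/"
)

target = DeltaTable.forName(spark, "dw.dim_customer")

# Upsert (SCD type 1 style) the incoming rows into the dimension table.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```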
Apply Link: Click Here To Apply
Job Role 6: AWS Data Pipeline

Required Qualifications & Skills:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering with hands-on experience in AWS cloud services.
  • Strong experience with AWS services like S3, Redshift, Glue, Lambda, Athena, EMR, and RDS.
  • Proficiency in SQL and experience with database management systems like Redshift, PostgreSQL, or MySQL.
  • Expertise in ETL/ELT processes and tools (e.g., AWS Glue, Informatica, or Talend).
  • Experience with data warehousing, data lakes, and big data processing technologies.
  • Familiarity with infrastructure-as-code (IaC) tools such as AWS CloudFormation or Terraform.
  • Proficiency in scripting languages like Python, Scala, or Java for data processing.
  • Experience with CI/CD pipelines and version control tools such as Git.
  • Understanding of data governance, security, and compliance in the cloud environment.
  • Strong analytical and problem-solving skills with attention to detail.

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines on AWS using services such as AWS Glue, Lambda, S3, Redshift, Athena, etc.
  • Build and manage ETL/ELT processes for large-scale data processing.
  • Implement best practices for data storage, data warehousing, and data transformation.
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and solutions.
  • Ensure data quality, consistency, and governance across systems.
  • Monitor and troubleshoot data pipelines, identifying and resolving issues proactively.
  • Implement data security and compliance standards in line with company policies and industry regulations.
  • Perform data migration from on-premises systems to AWS cloud infrastructure.
  • Automate data ingestion and transformation using AWS tools and services.
  • Stay updated on the latest AWS services and tools relevant to data engineering and recommend improvements.
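As a small, hedged example of the pipeline-orchestration work described above, the Python sketch below uses boto3 to start an AWS Glue job and poll it until it reaches a terminal state. The job name, bucket paths, and region are hypothetical.

```python
import time
import boto3

# Hypothetical Glue job name, S3 paths, and region; replace with your own values.
glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(
    JobName="daily-sales-etl",
    Arguments={
        "--source_path": "s3://my-raw-bucket/sales/",
        "--target_path": "s3://my-curated-bucket/sales/",
    },
)
run_id = run["JobRunId"]

# Poll until the job reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="daily-sales-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("Glue job finished with state:", state)
        break
    time.sleep(30)
```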
Apply Link: Click Here To Apply

Send your resume to [email protected] to confirm your slot, or apply via the links below.

Address for Walk-In Drive:

  • Cognizant Technology Solutions
  • 3rd Floor, Brilliant Titanium
  • Vijay Nagar, Indore.
Apply for Other Off-Campus Jobs
  • PhonePe: Click here
  • Concentrix: Click here
  • ShareChat: Click here
  • Ericsson: Click here
  • PTC: Click here
  • Zycus: Click here

Top MNCs Hiring (100+ Job Openings), Upload Your Resume 😍
Join us on WhatsApp!