Irdeto Off Campus Hiring: Data Engineer Intern | Freshers

Irdeto is hiring a Data Engineer Intern from the 2022, 2023 and 2024 batches through an off campus placement drive for the New Delhi, India location. Who can apply? Any Graduates/Postgraduates from BE, B.Tech, MBA, MCA, BCA, ME and M.Tech are eligible. You will find the application link for this position below.


Company Name: Irdeto
Job Role: Data Engineer Intern
Qualification: Bachelor’s Degree
Branch: BE / B.Tech / BCA / B.Sc. / M.Tech / MCA / M.Sc.
Batch: 2021, 2022, 2023, 2024
Salary: Not disclosed
Experience: Freshers
Location: New Delhi, India

Required Qualifications & Skills:

  • Enrolled in a Bachelor's or Master's degree program in Computer Science, Data Science, Information Technology, or a similar field.
  • Knowledge of programming languages such as Java and Python.
  • Basic understanding of programming concepts and SQL.
  • Familiarity with data integration and ETL concepts.
  • Experience with Big Data technologies is a plus.
  • Proficient problem-solving abilities with a capacity to work autonomously and within a team.
  • Effective communication skills and a knack for productive collaboration with diverse stakeholders.

Job Description & Responsibilities:

In the role of a Data Engineer Intern, you will be an integral part of our data engineering team, providing crucial support in the development, maintenance, and enhancement of our data infrastructure and pipelines. Your primary focus will revolve around data integration, data transformation, and data quality assurance, ensuring the seamless and reliable flow of data across multiple systems.

Key Responsibilities:

1. Data Integration: Work closely with data engineering and data science teams to comprehend data requirements and acquire data from both internal and external systems. Implement data extraction, transformation, and loading (ETL) processes to guarantee a smooth data flow into our data warehouse (a small illustrative sketch of this kind of ETL step is given after this list).

2. Data Pipeline Development: Contribute to the design and construction of scalable, resilient, and sustainable data pipelines that gather, process, and store data from various sources. You will work with technologies such as Apache Spark, Apache Kafka, and other relevant tools.

3. Data Quality Assurance: Participate in the development and execution of data quality checks and validation procedures to maintain data accuracy, consistency, and completeness (an example of such a check is shown after this list).

4. Database Management: Play a role in the management and upkeep of our databases, ensuring data integrity, security, and high availability. Monitor database performance and address issues as they arise.

5. Documentation: Document data engineering processes, data pipelines, and system configurations to facilitate knowledge sharing and maintain well-documented data infrastructure.

6. Automation: Identify opportunities for automating repetitive tasks and processes within data engineering workflows to enhance efficiency and reduce manual efforts.

7. Data Governance: Adhere to data governance and security policies to uphold data privacy and confidentiality in all data-related activities.

8. Collaborative Projects: Collaborate with cross-functional teams on data-related projects, supporting their data requirements and contributing to the success of company initiatives.

9. Continuous Learning: Stay updated with the latest advancements in data engineering technologies and practices, and proactively apply this knowledge to enhance data engineering processes.
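
To give freshers a rough picture of the kind of work described in responsibilities 1 and 2, here is a minimal, illustrative PySpark sketch of a single batch pipeline step. The paths, column names and filter logic are invented for the example and are not taken from Irdeto's actual systems.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical batch pipeline step: extract raw JSON events, transform
    # them into a daily aggregate, and load the result as Parquet files.
    spark = SparkSession.builder.appName("events-pipeline-example").getOrCreate()

    # Extract: read raw event data from a (made-up) storage location.
    events = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: keep only "play" events and count them per calendar day.
    daily_counts = (
        events
        .filter(F.col("event_type") == "play")
        .groupBy(F.to_date("timestamp").alias("day"))
        .count()
    )

    # Load: write the aggregate to a curated area of the data warehouse / lake.
    daily_counts.write.mode("overwrite").parquet(
        "s3://example-bucket/curated/daily_play_counts/"
    )

A production pipeline would add scheduling, error handling, schema checks and monitoring around a step like this, but the extract-transform-load shape stays the same.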
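
Responsibility 3 mentions data quality checks. As a purely illustrative example (the file name, column names and thresholds below are placeholders, not Irdeto's), a very small validation script in Python with pandas could look like this:

    import pandas as pd

    # Hypothetical data quality check on a daily extract: no missing keys,
    # no duplicate rows, and values inside an expected range.
    df = pd.read_csv("daily_extract.csv")

    checks = {
        "no_missing_ids": df["user_id"].notna().all(),
        "no_duplicate_rows": not df.duplicated().any(),
        "play_count_in_range": df["play_count"].between(0, 100_000).all(),
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        raise ValueError(f"Data quality checks failed: {failed}")
    print("All data quality checks passed.")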


APPLY LINK BELOW

CLICK HERE

Connect with us

WhatsApp Click Here
Telegram Click Here
LinkedIn Click Here
