Tirana, Data Specialist wanted (Middle Level)
Tirana
Type
full-time
Field | Job profile
Informatics | Programming
Language
English, Italian
Experience
experienced candidates only
Job Overview:
We are seeking a talented Data Specialist to join our team and help drive our data-focused projects to new heights.
Key Responsibilities:
Design, develop, and optimize data pipelines using Databricks.
Collaborate with the data engineering team to implement scalable big data solutions.
Utilize Python frameworks, with a strong emphasis on PySpark, to build and maintain robust data workflows.
Leverage Azure Cloud tools, including:
Azure Data Factory: For orchestrating and automating data movement.
Azure Storage Account: To manage and store large datasets securely.
Azure Logic Apps: For integrating and automating business-critical workflows.
Ensure data security, quality, and integrity across all platforms.
Provide insights and recommendations for data-driven decision-making.
Requirements:
Proven experience with Databricks.
Strong proficiency in PySpark and other Python data processing frameworks.
Hands-on experience with Azure Cloud services, particularly Azure Data Factory, Azure Storage Account, and Logic Apps.
Ability to work independently and in a team-oriented, collaborative environment.
Excellent problem-solving and analytical skills.
Strong communication skills for effective collaboration across teams.
Preferred Knowledge:
Knowledge of the Italian language is a plus.
How to Apply:
Interested candidates should submit their CV detailing their relevant experience to [email protected]
Apply