PySpark Jobs
Hello everyone, I need an expert with knowledge of databases and web services to complete some tasks by 6/3/[login to view URL] (budget: $70 USD). I will send the data and analytical details on chat, including step-by-step directions for the requirements and links for everything (all in one Word file)! Requirements: 1. Good knowledge of Spark SQL, the RDD & DataFrame APIs. Prior work on Databas...
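For context on what requirement 1 covers, here is a small illustration of the same aggregation expressed through the DataFrame API, Spark SQL, and the RDD API. The data and column names are made up for the example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-api-demo").getOrCreate()

# Made-up sales data for illustration only.
df = spark.createDataFrame(
    [("books", 12.0), ("books", 5.0), ("games", 30.0)],
    ["category", "amount"],
)

# DataFrame API: declarative transformations optimized by Catalyst.
df.groupBy("category").sum("amount").show()

# Spark SQL: the same plan, written as SQL over a temporary view.
df.createOrReplaceTempView("sales")
spark.sql("SELECT category, SUM(amount) AS total FROM sales GROUP BY category").show()

# RDD API: lower-level functional operations on the underlying rows.
totals = df.rdd.map(lambda r: (r["category"], r["amount"])).reduceByKey(lambda a, b: a + b)
print(totals.collect())

spark.stop()
```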
I have a PySpark file locally and am looking for someone who can develop a Golang module that creates a Dataproc cluster integrated with GKE, submits a job to that cluster, and then tears the cluster down after the job completes.
I have a PySpark job file locally and would like to run it on a Dataproc cluster integrated with GKE. Please bid if you have experience with Golang and GCP (Dataproc, GKE).
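The two posts above ask for a Go module; the Go and Python Dataproc clients wrap the same ClusterController/JobController RPCs, so to keep one language on this page, here is a hedged Python sketch of the create → submit → tear down flow. All project, cluster, bucket, and node-pool names are placeholders, and the Dataproc-on-GKE `virtual_cluster_config` shape should be verified against the current docs:

```python
from google.cloud import dataproc_v1

project_id = "my-project"        # placeholder
region = "us-central1"           # placeholder
cluster_name = "pyspark-on-gke"  # placeholder
gke_cluster = f"projects/{project_id}/locations/{region}/clusters/my-gke-cluster"

opts = {"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
clusters = dataproc_v1.ClusterControllerClient(client_options=opts)
jobs = dataproc_v1.JobControllerClient(client_options=opts)

# Dataproc on GKE uses virtual_cluster_config instead of a VM-based config;
# the GKE cluster and node pool must already exist.
cluster = {
    "project_id": project_id,
    "cluster_name": cluster_name,
    "virtual_cluster_config": {
        "staging_bucket": "my-staging-bucket",  # placeholder
        "kubernetes_cluster_config": {
            "kubernetes_namespace": "dataproc",
            "gke_cluster_config": {
                "gke_cluster_target": gke_cluster,
                "node_pool_target": [
                    {"node_pool": f"{gke_cluster}/nodePools/default-pool",
                     "roles": ["DEFAULT"]}
                ],
            },
            "kubernetes_software_config": {
                # Version string is an assumption; use a currently supported one.
                "component_version": {"SPARK": "3.1-dataproc-14"},
            },
        },
    },
}

clusters.create_cluster(
    request={"project_id": project_id, "region": region, "cluster": cluster}
).result()  # block until the cluster is up

try:
    # Submit the local PySpark file after uploading it to GCS.
    job = {
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {"main_python_file_uri": "gs://my-bucket/job.py"},  # placeholder
    }
    jobs.submit_job_as_operation(
        request={"project_id": project_id, "region": region, "job": job}
    ).result()  # block until the job finishes
finally:
    # Tear the cluster down whether or not the job succeeded.
    clusters.delete_cluster(
        request={"project_id": project_id, "region": region,
                 "cluster_name": cluster_name}
    ).result()
```

In Go, the equivalent calls live in cloud.google.com/go/dataproc/v2/apiv1 (ClusterControllerClient.CreateCluster, JobControllerClient.SubmitJobAsOperation, ClusterControllerClient.DeleteCluster), following the same request shapes.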
I am looking for a trainer with 5-6 years of experience as a Hadoop developer with PySpark; AWS or Azure experience is optional. I need someone to train me on the above technologies and guide me to hands-on experience.
Experience in AWS required: Lambda, Glue, Python, PySpark, Kafka. Daily 6 to 8 hours of work, including 2 hours of client interaction. Payment will be on a monthly basis.
From this data set: transactions data with 2.5M records.
Task 1: Using Sqoop, load the data into HDFS (I have done this).
Task 2: From HDFS, using PySpark, create 4 dimension tables and 1 fact table (see the sketch after this list).
Task 3: Copy those tables to a Redshift cluster.
Task 4: Analyse the data using Redshift queries (8 queries).
At every step there is a hint document against which we need to compare results.
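For Task 2, and one possible route for Task 3, here is a minimal PySpark sketch of the star-schema build. Every column name and path is an assumption (the real schema comes from the Sqoop import and the hint documents), only two of the four dimensions are shown, and the JDBC write is just one way to land tables in Redshift; the hints may prescribe another (e.g. UNLOAD/COPY through S3):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transactions-star-schema").getOrCreate()

# Hypothetical HDFS path and columns; adjust to the actual Sqoop output.
tx = spark.read.parquet("hdfs:///user/etl/transactions")

# One dimension per descriptive entity: deduplicated rows plus a surrogate key.
dim_customer = (
    tx.select("customer_id", "customer_name").distinct()
      .withColumn("customer_key", F.monotonically_increasing_id())
)
dim_date = (
    tx.select(F.to_date("txn_ts").alias("txn_date")).distinct()
      .withColumn("date_key", F.monotonically_increasing_id())
)
# The remaining two dimensions (e.g. product, store) follow the same pattern.

# Fact table: measures plus foreign keys into the dimensions.
fact_sales = (
    tx.withColumn("txn_date", F.to_date("txn_ts"))
      .join(dim_customer, "customer_id")
      .join(dim_date, "txn_date")
      .select("customer_key", "date_key", "amount")
)

# Plain JDBC write to Redshift; requires the Redshift JDBC driver on the
# Spark classpath. URL and credentials below are placeholders.
jdbc_url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev"
for name, df in [("dim_customer", dim_customer),
                 ("dim_date", dim_date),
                 ("fact_sales", fact_sales)]:
    (df.write.format("jdbc")
       .option("url", jdbc_url)
       .option("dbtable", name)
       .option("user", "etl_user")   # placeholder
       .option("password", "...")    # placeholder
       .option("driver", "com.amazon.redshift.jdbc42.Driver")
       .mode("overwrite")
       .save())
```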