HDP Installation: either a full cluster or the sandbox will work (scroll down to the Add-Ons section). This exercise assumes a non-Kerberized cluster. How to set up the environment is up to you. You will need hardware for HDP as well as a Windows or Mac box. You can build a virtual environment using VMware or VirtualBox and install the HDP 2.5 sandbox, or go to the cloud with either Azure IaaS, HDInsight, or AWS and utilize a cloud instance of HDP (be careful of data transfer costs). I ended up using Windows Server 2016 Evaluation Edition, and for HDP I'm running a 4 node cluster in the cloud.

Again, how you get data is up to you. I used TPC-DS data from the hive-testbench (see pre-reqs); I used a 5 GB dataset, but this is configurable. Your cluster handles generating the data set as well as loading data into Power BI. Another option would be to use Apache Sqoop to load data from an existing RDBMS data source. Following are instructions for moving data from SQL Server AdventureWorks to Hive. First, copy the SQL Server JDBC driver into the Sqoop library:

cp sqljdbc_4.0/enu/sqljdbc4.jar /usr/hdp/current/sqoop-server/lib
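With the driver in place, a Sqoop import from AdventureWorks into Hive looks roughly like the sketch below. This is not the post's exact command: the hostname, credentials file, database/table names, and mapper count are placeholders you would adjust for your environment. The `-- --schema` extra argument is the Sqoop SQL Server connector's way of addressing schema-qualified tables such as those in `Sales`.

```shell
# Sketch of a Sqoop import from SQL Server AdventureWorks into Hive.
# Host, credentials, and table choice are placeholders -- adjust for your cluster.
sqoop import \
  --connect "jdbc:sqlserver://SQLSERVER_HOST:1433;databaseName=AdventureWorks" \
  --username sqoop_user \
  --password-file /user/sqoop/sqlserver.password \
  --table SalesOrderHeader \
  --hive-import \
  --hive-table adventureworks.salesorderheader \
  --num-mappers 4 \
  -- --schema Sales
```

Run this as a user with write access to the Hive warehouse; Sqoop will create the Hive table if it does not already exist.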