Debugging Hadoop and Spark Locally with IDEA on Windows


    Hadoop:

    Download hadoop-2.7.6.tar.gz and extract it. Set HADOOP_HOME to the extracted directory and add %HADOOP_HOME%\bin to Path. Extract hadoop2.7.6_hadoop_dll_winutil_exe, paste its contents into the bin directory, and paste winutils.exe into System32. Then add the following dependencies to the project's pom.xml (jdk.tools must be declared with system scope pointing at ${JAVA_HOME}/lib/tools.jar, since it is not in any Maven repository):

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.7.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.7.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
            <version>2.7.6</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.10</version>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.json</groupId>
            <artifactId>json</artifactId>
            <version>20180130</version>
        </dependency>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
            <scope>system</scope>
            <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
        </dependency>
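
    To check that the setup works, a small HDFS smoke test such as the following can be launched straight from IDEA. This is a sketch rather than part of the original post: hdfs://localhost:9000 and C:\hadoop-2.7.6 are placeholders for your own NameNode address and Hadoop directory.

        import java.net.URI;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileStatus;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HdfsSmokeTest {
            public static void main(String[] args) throws Exception {
                // If HADOOP_HOME is not picked up by the IDE, point hadoop.home.dir
                // at the extracted directory containing bin\winutils.exe (assumed path).
                System.setProperty("hadoop.home.dir", "C:\\hadoop-2.7.6");

                Configuration conf = new Configuration();
                // Assumed NameNode address; replace with your cluster's fs.defaultFS.
                FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);

                // List the root directory to confirm connectivity.
                for (FileStatus status : fs.listStatus(new Path("/"))) {
                    System.out.println(status.getPath());
                }
                fs.close();
            }
        }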

    Spark:

    Calling conf.setMaster("local") is all that is needed to run Spark locally. Note that every compile error in the project must be resolved first; do not skip them with the "build, but not check" option, or the run will fail with a "main class not found" error.
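
    A minimal local Spark job in Java might look like the sketch below, assuming a matching spark-core dependency (e.g. spark-core_2.11) is on the classpath; the app name and sample data are illustrative only.

        import java.util.Arrays;

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.api.java.JavaSparkContext;

        public class SparkLocalDebug {
            public static void main(String[] args) {
                // "local" runs the driver and a single executor thread in-process,
                // so the job can be stepped through in IDEA; "local[*]" uses one
                // thread per core instead.
                SparkConf conf = new SparkConf()
                        .setAppName("LocalDebug")  // illustrative app name
                        .setMaster("local");

                JavaSparkContext sc = new JavaSparkContext(conf);
                JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
                System.out.println("sum = " + numbers.reduce(Integer::sum));
                sc.stop();
            }
        }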

    In both cases, remove the XML configuration files used on the production cluster (core-site.xml, hdfs-site.xml, and the like) from the project's resources, so the local run does not pick up cluster settings.
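
    If one of those files does slip onto the classpath, the cluster address it sets can also be overridden in code. The snippet below is a sketch, not from the original post; the C:/tmp path is an arbitrary example.

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class LocalFsOverride {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                // Force the local filesystem even if a leftover core-site.xml on
                // the classpath still points fs.defaultFS at the cluster.
                conf.set("fs.defaultFS", "file:///");

                FileSystem fs = FileSystem.get(conf);
                System.out.println(fs.exists(new Path("C:/tmp")));  // sample path
            }
        }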
