一: Installing Cygwin
cygwin 1.7.15
Download link
Installation steps omitted (remember to select the ssh package during setup).
After installation, add the usr\sbin directory to the PATH environment variable.
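From a Cygwin shell, you can check that the PATH change took effect before configuring sshd. A minimal sketch; the install root below is an assumption, adjust it to wherever you installed Cygwin:

```shell
# Hypothetical Cygwin install root; change this to your own install path.
CYGWIN_ROOT="/cygdrive/c/cygwin"
export PATH="$PATH:$CYGWIN_ROOT/usr/sbin"
# Confirm the sshd tools directory is now on PATH.
echo "$PATH" | grep -q "usr/sbin" && echo "usr/sbin is on PATH"
```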
二: ssh configuration
$ ssh-host-config
*** Query: Should privilege separation be used? (yes/no)
no
*** Query: (Say "no" if it is already installed as a service) (yes/no)
yes
*** Query: Enter the value of CYGWIN for the daemon: []
ntsec
*** Query: Do you want to use a different name? (yes/no)
yes
*** Query: Enter the new user name: admin
*** Query: Reenter: admin
*** Query: Create new privileged user account 'admin'? (yes/no)
yes
*** Query: Please enter the password: (your password)
*** Query: Reenter: (repeat the password)
Start the ssh service:
net start sshd
Set up passwordless login:
$ ssh-keygen (on Windows 7, run the shell as Administrator)
Enter file in which to save the key (/home/Administrator/.ssh/id_rsa): (press Enter)
Enter passphrase (empty for no passphrase): (press Enter)
Enter same passphrase again: (press Enter)
cd /cygdrive/c/cygwin/home/Administrator/.ssh
(the path follows your Cygwin install directory, e.g. D:\cygwin\home\Administrator\.ssh)
cp id_rsa.pub authorized_keys
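Note that the cp above overwrites any existing authorized_keys; appending with cat and tightening file permissions is safer, since sshd typically rejects a key file that is too permissive. A minimal sketch in a throwaway directory with a dummy public key (the real id_rsa.pub comes from ssh-keygen):

```shell
# Throwaway directory standing in for ~/.ssh; the key line below is a dummy.
KEYDIR=$(mktemp -d)
echo "ssh-rsa AAAAB3...dummy Administrator@host" > "$KEYDIR/id_rsa.pub"
# Append rather than overwrite, so previously authorized keys survive.
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
# sshd may refuse keys in a group/world-readable file; restrict permissions.
chmod 600 "$KEYDIR/authorized_keys"
```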
Log in via ssh:
$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is 86:07:88:db:34:94:f8:09:6d:f4:7d:19:48:67:fe:e1.
Are you sure you want to continue connecting (yes/no)? yes
三: Configuring and starting Hadoop (version hadoop-1.0.0)
1. Configuration: edit these four files under the hadoop/conf directory:
hadoop-env.sh, core-site.xml, hdfs-site.xml, mapred-site.xml
①.hadoop-env.sh
export JAVA_HOME=/cygdrive/d/Java/jdk1.6.0_10
②.conf/core-site.xml:
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
③.conf/hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
④.conf/mapred-site.xml
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:9001</value>
</property>
</configuration>
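If you prefer to script these edits, each file can be written with a heredoc. A sketch for core-site.xml; the scratch directory here is illustrative — in practice the file goes into hadoop/conf:

```shell
# Scratch directory standing in for hadoop/conf.
CONF_DIR=$(mktemp -d)
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
EOF
# Quick sanity check that the NameNode address landed in the file.
grep "fs.default.name" "$CONF_DIR/core-site.xml"
```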
2. Startup
Change to the Hadoop install directory: cd /cygdrive/d/hadoop/hadoop-1.0.0
Format the NameNode: bin/hadoop namenode -format
Start Hadoop: bin/start-all.sh
Create a directory named test in HDFS: bin/hadoop fs -mkdir test
Upload files: bin/hadoop fs -put *.txt test (uploads all text files in the Hadoop root directory to the test directory)
You can also verify the upload via the "Browse the filesystem" link on the NameNode web UI - http://localhost:50070/
JobTracker - http://localhost:50030/
In Hadoop 1.0, the TaskTracker may fail to start; see the separate post "问题记录" (problem log) for the fix.