a. Refer to: http://ebiquity.umbc.edu/Tutorials/Hadoop/00%20-%20Intro.html
b. Download Eclipse 3.3.2 Europa: http://www.eclipse.org/downloads/packages/release/europa/winter
c. Download Hadoop 0.19.2: http://apache.osuosl.org//hadoop/core/hadoop-0.19.2/
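After downloading, extract the tarball. The target path below (~/apache) is just an example, chosen to match the paths used in step g; adjust it to wherever you keep the archive:
mkdir -p ~/apache
tar xzf hadoop-0.19.2.tar.gz -C ~/apache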
Set up Hadoop using the following hadoop-site.xml, as described at: http://hadoop.apache.org/common/docs/r0.19.2/quickstart.html#Local
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
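This file goes in conf/hadoop-site.xml under the Hadoop directory. Once the daemons are running (step g below), a quick sanity check that fs.default.name was picked up is to list the HDFS root; the path assumes the layout from step c:
~/apache/hadoop-0.19.2/bin/hadoop fs -ls /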
d. cp [yourpath]/hadoop-0.19.2/contrib/eclipse-plugin/hadoop-0.19.2-eclipse-plugin.jar [yourpath]/eclipse/plugins/
e. Start Eclipse: you can launch it from Ubuntu's File Browser.
f. Window > Open Perspective > Other > Map/Reduce
g. To start Hadoop, run the format command once, then open five terminals and start one daemon in each:
~/apache/hadoop-0.19.2/bin/hadoop namenode -format
~/apache/hadoop-0.19.2/bin/hadoop namenode
~/apache/hadoop-0.19.2/bin/hadoop secondarynamenode
~/apache/hadoop-0.19.2/bin/hadoop jobtracker
~/apache/hadoop-0.19.2/bin/hadoop datanode
issue: if the datanode fails with "Unexpected version of storage directory /tmp/hadoop-jongwook/dfs/data", delete the "data" folder named in the error and restart the datanode
~/apache/hadoop-0.19.2/bin/hadoop tasktracker
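Alternatively, if passphraseless ssh to localhost is configured (see the quickstart page above), the bundled script starts all the daemons at once; jps (from the JDK) should then list NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker:
~/apache/hadoop-0.19.2/bin/start-all.sh
jps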
h. In Eclipse, set up a new Hadoop location (see http://ebiquity.umbc.edu/Tutorials/Hadoop/17%20-%20set%20up%20hadoop%20location%20in%20the%20eclipse.html).
The new Hadoop location takes the following values, as defined in hadoop-site.xml:
Map/Reduce Master: localhost:9001
DFS Master: localhost:9000
user name: jongwook
mapred.job.tracker: localhost:9001
i. How to run a Hadoop example in Eclipse
Refer to: http://dal-cloudcomputing.blogspot.com/2009/08/hadoop-example-mymaxtemperaturewithcomb.html
- Create (or import) the Hadoop example from the blog above as an Eclipse Map/Reduce project: File > New > Map/Reduce Project
- The 1901 input file is not available, so use only 1902
- Open the Map/Reduce perspective in Eclipse
- In this view you can create DFS folders and upload files to DFS; a command-line alternative is sketched below
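If you prefer the command line to the DFS view, the sample input can be staged and the job output inspected with hadoop fs. The input and output paths here are illustrative and must match what the example job expects:
~/apache/hadoop-0.19.2/bin/hadoop fs -mkdir input
~/apache/hadoop-0.19.2/bin/hadoop fs -put 1902 input
# after the job finishes:
~/apache/hadoop-0.19.2/bin/hadoop fs -cat output/part-00000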
Saturday, February 26, 2011
The Technical Demand of Cloud Computing (no-SQL DB, Map/Reduce Hadoop)
Technical Report granted by KISTI (Korea Institute of Science and Technology Information, 한국과학기술정보연구원)
by Jongwook Woo, California State University Los Angeles

Friday, February 25, 2011
The Technical Demand of Cloud Computing in Korean
Technical Report granted by KISTI (Korea Institute of Science and Technology Information, 한국과학기술정보연구원)
by Jongwook Woo, California State University Los Angeles
Labels: cloud computing, Column Oriented DB, hadoop, map/reduce, no-SQL DB