Apache Hadoop Development Tools (HDT) aims to provide Eclipse plugins that assist developers and simplify development on the Hadoop platform. In this blog, we give an overview of a few of the features HDT offers.
HDT is a set of plugins for the Eclipse IDE that help developers build applications against the Hadoop platform. The plugins offer the following features within the Eclipse IDE:
- Launching MapReduce programs on a Hadoop cluster
- Wizards for creating Hadoop-based projects
- Wizards for generating Java classes for Mapper/Reducer/Driver, etc.
- Listing running jobs on an MR cluster
- Browsing and inspecting HDFS nodes
- Browsing and inspecting ZooKeeper nodes
The project can serve as a single endpoint for your ZooKeeper, HDFS, and MR clusters. You can simply connect to your ZooKeeper or HDFS instance and browse it or add data to it, while the MR cluster connection is used to submit jobs and check the status of submitted (running) jobs.
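Under the hood, these views correspond to the standard HDFS and ZooKeeper client APIs. As a minimal, hypothetical sketch of what browsing an HDFS instance looks like programmatically (the namenode URI and path below are placeholders, not values HDT prescribes):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsBrowse {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder namenode address; substitute your cluster's fs.defaultFS value.
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        // List the contents of a directory, much like HDT's HDFS browser view does.
        for (FileStatus status : fs.listStatus(new Path("/user"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}
```

Browsing ZooKeeper nodes maps onto the equivalent ZooKeeper client calls, such as getChildren and getData.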
MapReduce templates/projects
HDT also supports creating Hadoop projects. You point it to a Hadoop installation, and it pulls in the required libraries and creates an Eclipse project. That is not all: you can also generate Mapper/Reducer/Driver/Partitioner classes based on the org.apache.hadoop.mapreduce API. A minimal sketch of the kind of skeleton these wizards produce is shown below.
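The following is a minimal sketch, targeting the Hadoop 2.x org.apache.hadoop.mapreduce API, of a Mapper, Reducer, and Driver for a simple word count (the class and job names are illustrative, not what HDT generates verbatim):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {

    // Mapper: emits (word, 1) for every token in the input line.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    // Driver: configures the job, submits it, and waits for completion.
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```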
Eclipse support
The project works with Eclipse 3.6 and above, so developers can use it on Indigo, Juno, and Kepler as well.
Multiple version support
At present, the project supports two versions of the Hadoop platform: 1.1 and 2.2. Because it is based on the Eclipse plugin architecture, support for other versions, such as 0.23 and CDH4, can be added in future releases. The tool lets developers work with multiple versions of Hadoop from within a single IDE.
This post was shared by Hadoop professionals to help developers around the globe understand HDT (Hadoop Development Tools) and follow best practices for Hadoop development.