This article was migrated from Juejin. Original link: Big Data 01 - Basic Environment Setup

Background

Three public cloud servers are used to build a Hadoop learning environment:

  • h121: 2C4G
  • h122: 2C4G
  • h123: 2C2G

⚠️ This setup is for learning purposes only; since the servers face the public internet, configure firewall policies carefully.

Hadoop Components

HDFS (Hadoop Distributed File System)

A distributed file system that splits files into blocks and stores them across different nodes in the cluster, providing high fault tolerance and reliability through replication.
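Once the cluster is up (covered later in this series), basic HDFS operations can be exercised from the shell. A minimal sketch, assuming the `hdfs` CLI is on the PATH and a local file `data.txt` exists:

```shell
# Create a user directory in HDFS and copy a local file into it
hdfs dfs -mkdir -p /user/hadoop
hdfs dfs -put ./data.txt /user/hadoop/

# List the directory, then inspect how the file was split into blocks
hdfs dfs -ls /user/hadoop
hdfs fsck /user/hadoop/data.txt -files -blocks
```

`fsck` reports the block count and replication factor, which makes the block-splitting behavior described above directly visible.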

MapReduce

A data-processing model with two phases: the Map phase transforms input records into key-value pairs, and the Reduce phase aggregates the values grouped by key.
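The examples jar bundled with Hadoop gives a quick way to see both phases in action. A sketch running the classic word count, assuming input files have already been uploaded to an HDFS directory `/user/hadoop/input`:

```shell
# Map: tokenize each line into (word, 1) pairs; Reduce: sum the counts per word
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
  wordcount /user/hadoop/input /user/hadoop/output

# Inspect the reducer output (the output directory must not exist beforehand)
hdfs dfs -cat /user/hadoop/output/part-r-00000
```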

YARN (Yet Another Resource Negotiator)

The cluster resource manager, responsible for scheduling jobs and allocating computing resources (CPU, memory) across the cluster.
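Once the YARN daemons are running, the `yarn` CLI can be used to check what the ResourceManager sees; a quick sketch:

```shell
# List the NodeManagers registered with the ResourceManager
yarn node -list

# Show running applications and their current state
yarn application -list
```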

Hadoop Common

Provides common tools and libraries, such as file system abstraction, serialization mechanism, and RPC framework.

Advantages

  • Scalability: Can add nodes to expand computing and storage capacity
  • Fault Tolerance: HDFS replicates data across different nodes
  • Cost Effectiveness: Runs on cheap commodity hardware
  • Flexibility: Processes data in various formats

Java Environment Configuration

# Install OpenJDK 8
sudo apt update
sudo apt install -y openjdk-8-jdk

# Find Java directory
readlink -f $(which java)

# Configure environment variables (add to /etc/profile)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH

# Refresh environment variables
source /etc/profile

# Verify
java -version

Hadoop Environment Configuration

# Create directories (software for archives, servers for installs)
sudo mkdir -p /opt/software /opt/servers

# Download Hadoop into /opt/software
cd /opt/software
sudo wget -O hadoop-2.9.2.tar.gz https://archive.apache.org/dist/hadoop/common/hadoop-2.9.2/hadoop-2.9.2.tar.gz

# Extract to /opt/servers
sudo tar -zxvf hadoop-2.9.2.tar.gz -C /opt/servers

# Configure environment variables (add to /etc/profile)
export HADOOP_HOME=/opt/servers/hadoop-2.9.2
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin

# Refresh environment variables
source /etc/profile

# Verify
hadoop version
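Before wiring the three nodes together, the installation can be smoke-tested in local (standalone) mode, which runs MapReduce in a single JVM with no daemons and uses the local filesystem instead of HDFS. A sketch using the bundled examples jar; the file names here are arbitrary:

```shell
# Prepare a small local input file
mkdir -p ~/input
echo "hello hadoop hello hdfs" > ~/input/words.txt

# Run word count in standalone mode (reads/writes the local filesystem;
# the output directory must not already exist)
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
  wordcount ~/input ~/output

# View the per-word counts
cat ~/output/part-r-00000
```

If this runs cleanly, both the Java and Hadoop environment variables are set correctly, and the next step is distributing the configuration to h122 and h123.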
