Friday, March 24, 2017

Solving permission denied (publickey) error message on Hadoop name node

For a Hadoop cluster to work properly, all the Hadoop components, such as the NameNode, SecondaryNameNode, ResourceManager and NodeManager, must be started. If you created a Hadoop cluster yourself, as I described in my previous blog, you might get the error below when you start the name node using '':

'Permission denied (publickey)'

This is because, by default, the Hadoop start script tries to start the SecondaryNameNode on the default IP address, which should normally point to localhost. To do that, it tries to establish an SSH connection to that IP address and, as the error message states, the connection fails because passwordless SSH is not set up correctly for the host. The steps below will help you fix this.

1) If you are using a config file for passwordless SSH (common on AWS servers)

To set up passwordless SSH, you may have created a config file in the .ssh folder of your home directory with entries for the hostname, user ID and the key file to use when connecting over SSH, as shown below.
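The example itself did not survive here, so the following is a minimal sketch of such a `~/.ssh/config` entry; the host alias, hostname, user and key path are all placeholders to adapt to your own servers:

```
# ~/.ssh/config  (host alias, HostName, User and key path are placeholders)
Host hadoop-master
    HostName ec2-xx-xx-xx-xx.compute-1.amazonaws.com
    User ec2-user
    IdentityFile ~/.ssh/my-aws-key.pem
```

With an entry like this, `ssh hadoop-master` picks up the key automatically. Make sure the hostname Hadoop's start scripts actually connect to matches an entry in this file; if it does not, SSH falls back to other authentication methods and fails with the publickey error.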

Solving jps: command not found error message

jps is a command-line utility that comes with the JDK and lets you view all the Java processes running on a host. It is extremely useful when your environment has multiple background Java processes running, as happens in a Hadoop cluster. If you created a Hadoop cluster yourself, as mentioned in my previous blogs, 'jps' is a very useful diagnostic utility for making sure the NameNode, DataNode, NodeManager and ResourceManager processes are running.

Sometimes when you try to use 'jps' you get the error message "jps: command not found". Here is how you solve it.
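The usual cause is that the JDK's bin directory is not on the PATH. A sketch of the fix, assuming the JDK is installed under /usr/java/default (replace this with your actual install path) — append these lines to ~/.bashrc:

```shell
# Assumed JDK location; adjust to where your JDK is actually installed
export JAVA_HOME=/usr/java/default
# Put the JDK tools (java, javac, jps, ...) on the PATH
export PATH=$JAVA_HOME/bin:$PATH
```

Reload the file with `source ~/.bashrc` and run `jps` again; it should now list the running Java processes with their PIDs.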

Thursday, March 16, 2017

Setting up a Hadoop Cluster in RHEL 6 - Preparing the servers

In this two-part series, we will discuss how to set up a Hadoop cluster on RHEL 6. I was inspired by my friend, who took the Big Data Specialization from the University of California San Diego through Coursera, and I thought it would be a good learning experience to set up a Hadoop cluster myself as well. Even though Cloudera's open source platform is the most common distribution of Hadoop, I really wanted to understand some low-level details of the Hadoop components and how they interact with each other. Therefore, I thought it would be useful to install and configure the basic Apache Hadoop distribution from scratch.

In this part, we will discuss the setup of the Linux environment and how to install the Hadoop distribution. In Part 2, we will discuss how to configure the core components of the Hadoop ecosystem and start the services. Okay, let's get started.
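One part of preparing the servers is making every node resolvable by name. As an illustration (all IP addresses and hostnames below are placeholders, not from the original post), each node's /etc/hosts would map every cluster member:

```
# /etc/hosts on every node -- IPs and hostnames are placeholders
192.168.1.10   hadoop-master
192.168.1.11   hadoop-slave1
192.168.1.12   hadoop-slave2
```

The same entries should appear on all nodes so that the Hadoop configuration files can refer to hosts by name consistently.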

Wednesday, March 15, 2017

Install and configure PuTTY for making SSH connections

Being able to connect securely to remote hosts from Windows using SSH is a very important and common requirement, especially when you are working with remote Linux servers. SSH is a high-security protocol that uses strong encryption to shield your connection from eavesdropping, capture and other sorts of attack. In this post we will discuss exactly how to do this using the PuTTY tool. We will also show how to create and save PuTTY sessions, and discuss how to configure passwordless login using a key pair.

PuTTY is a free, open source SSH client and terminal emulator for Windows. It has a very easy-to-use interface and the ability to configure and store sessions, so that we can connect to configured hosts in a single click.

Wednesday, October 23, 2013

Resolving Unsatisfied link error in OpenNI java on Windows

If you are getting either of the following errors while trying to run OpenNI on a Windows PC, follow the steps below to get rid of them (I am assuming you are using Eclipse):

java.lang.UnsatisfiedLinkError: C:\Program Files\OpenNI2\Redist\OpenNI2.jni.dll: Can't find dependent libraries


java.lang.UnsatisfiedLinkError: no OpenNI2.jni in java.library.path
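Both errors mean the JVM cannot locate OpenNI2.jni.dll or the DLLs it depends on. A hedged sketch of the fix: point java.library.path at the OpenNI2 Redist folder (the path below is the installer's default location; adjust it if yours differs) via Run Configurations > Arguments > VM arguments in Eclipse, and also add that folder to the Windows PATH so the dependent DLLs can be found. The small class below (names are mine) just prints where the JVM is actually searching, which helps confirm the setting took effect:

```java
// Eclipse VM argument (assumed default install path -- adjust as needed):
//   -Djava.library.path="C:\Program Files\OpenNI2\Redist"
public class LibPathCheck {
    public static void main(String[] args) {
        // Prints the directories the JVM searches for native (.dll) libraries
        System.out.println(System.getProperty("java.library.path"));
    }
}
```

If the Redist folder does not appear in the printed list, the VM argument was not applied to the run configuration you are actually launching.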

Wednesday, February 27, 2013

Connecting MK808 device to ADB using the bridge and OEM USB Drivers

The MK808 is a very cheap and handy Android device to play around with. Connected to an HDMI monitor, it can easily run Android applications. Before starting, we first need to connect the MK808 to ADB so that we can install, run and debug applications. The OEM drivers provided by Google are not sufficient to connect the MK808 to ADB. After installing the OEM drivers, please follow the steps below.
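The detailed steps did not survive here; the typical remaining step is to register the device's USB vendor ID with ADB. A sketch, assuming the MK808's Rockchip (RK3066) chipset, whose USB vendor ID is 0x2207:

```
# %USERPROFILE%\.android\adb_usb.ini -- create the file if it does not exist.
# 0x2207 is the Rockchip USB vendor ID, assumed here for the MK808.
0x2207
```

Then restart the ADB server (`adb kill-server` followed by `adb devices`); the MK808 should now show up in the device list.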

Get Bitmap image from a YUV in Android

The Android camera preview image is in YUV format. The YUV byte[] holds the grayscale image in the first width*height bytes, with the color information after it. To process preview frames without taking a picture via the Camera.takePicture method, we usually need to convert the YUV byte[] into a Bitmap. The following code will help do that.
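The code itself did not survive here, so the following is a sketch of one common approach (class and method names are mine): decode an NV21 (YUV420SP) preview frame into ARGB pixels in plain Java, using the fixed-point BT.601 conversion. On Android you would then wrap the result with Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888).

```java
/**
 * Sketch (names are illustrative): decode an NV21 (YUV420SP) camera
 * preview frame into an array of ARGB_8888 pixels.
 */
public class Nv21Decoder {

    public static int[] toArgb(byte[] nv21, int width, int height) {
        final int frameSize = width * height;
        int[] argb = new int[frameSize];
        for (int j = 0; j < height; j++) {
            // The interleaved V/U plane starts after the Y plane;
            // one chroma row covers two luma rows.
            int uvp = frameSize + (j >> 1) * width;
            int u = 0, v = 0;
            for (int i = 0; i < width; i++) {
                int y = (0xff & nv21[j * width + i]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {          // read a new V/U pair every 2 pixels
                    v = (0xff & nv21[uvp++]) - 128;
                    u = (0xff & nv21[uvp++]) - 128;
                }
                // Fixed-point BT.601 YUV -> RGB conversion
                int y1192 = 1192 * y;
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                r = Math.min(262143, Math.max(0, r));
                g = Math.min(262143, Math.max(0, g));
                b = Math.min(262143, Math.max(0, b));
                argb[j * width + i] = 0xff000000
                        | ((r << 6) & 0x00ff0000)
                        | ((g >> 2) & 0x0000ff00)
                        | ((b >> 10) & 0x000000ff);
            }
        }
        return argb;
    }
}
```

An alternative on Android is to wrap the frame in android.graphics.YuvImage, compress it to JPEG, and decode it back with BitmapFactory.decodeByteArray; the pure-Java version above avoids the JPEG round trip.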
