Difficulty: Expert
Estimated Time: 30 minutes

Hello. Today we will teach you how to create an application image on the BlueData EPIC platform. In this scenario, you will learn how to upgrade a Spark image using the BlueData EPIC Application Workbench on a CentOS base container image.

Prerequisites:

- Basic knowledge of containers
- Linux administration
- Git
- Spark and Hadoop



Upgrading a Spark application image


Step 1 - Creating a base directory

To create the image on the BlueData EPIC platform, you need the BlueData EPIC App Workbench installed.

We have already set up and installed the BlueData EPIC Application Workbench for this scenario.

If you want to install the BlueData EPIC App Workbench in your own environment, follow the instructions at this link: Link

To check what version of the Application Workbench is running, please execute the following command:
bdwb --version
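
If the version command fails, it may simply mean the workbench binary is not on your PATH. As a quick sanity check you can use standard shell tooling (nothing here is specific to the workbench itself):

command -v bdwb

This prints the full path of the bdwb executable if the shell can find it; no output means it is not on your PATH.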


Now, to begin, let us create the base directory. This directory will house all the files and components necessary to build the application image.
mkdir ~/Source
mkdir ~/Source/Spark
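
As an aside, the two mkdir calls above can be collapsed into one by using the standard -p flag, which creates any missing parent directories along the way (the directory names are just the ones used in this scenario):

mkdir -p ~/Source/Spark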