Host the Spark UI on Amazon SageMaker Studio

Amazon SageMaker offers several ways to run distributed data processing jobs with Apache Spark, a popular distributed computing framework for big data processing.

You can run Spark applications interactively from Amazon SageMaker Studio by connecting SageMaker Studio notebooks to AWS Glue Interactive Sessions and running Spark jobs on a serverless cluster. With interactive sessions, you can choose Apache Spark or Ray to process large datasets without worrying about cluster management.

Alternatively, if you need more control over the environment, you can use a pre-built SageMaker Spark container to run Spark applications as batch jobs on a fully managed distributed cluster with Amazon SageMaker Processing. This option lets you select the instance type (compute optimized, memory optimized, and more), the number of nodes in the cluster, and the cluster configuration, giving you greater flexibility for data processing and model training.

Finally, you can run Spark applications by connecting Studio notebooks with Amazon EMR clusters, or by running your Spark cluster on Amazon Elastic Compute Cloud (Amazon EC2).

All these options let you generate and store Spark event logs so you can analyze them through the web-based user interface commonly called the Spark UI, which runs a Spark History Server to monitor the progress of Spark applications, track resource usage, and debug errors.

In this post, we share a solution for installing and running Spark History Server on SageMaker Studio and accessing the Spark UI directly from the SageMaker Studio IDE, for analyzing Spark logs produced by different AWS services (AWS Glue Interactive Sessions, SageMaker Processing jobs, and Amazon EMR) and stored in an Amazon Simple Storage Service (Amazon S3) bucket.

Solution overview

The solution integrates Spark History Server into the Jupyter Server app in SageMaker Studio. This allows users to access Spark logs directly from the SageMaker Studio IDE. The integrated Spark History Server supports the following:

  • Accessing logs generated by SageMaker Processing Spark jobs
  • Accessing logs generated by AWS Glue Spark applications
  • Accessing logs generated by self-managed Spark clusters and Amazon EMR

A utility command line interface (CLI) called sm-spark-cli is also provided for interacting with the Spark UI from the SageMaker Studio system terminal. With the sm-spark-cli, you can manage the Spark History Server without leaving SageMaker Studio.

The solution consists of shell scripts that perform the following actions:

  • Install Spark on the Jupyter Server for SageMaker Studio user profiles or for a SageMaker Studio shared space
  • Install the sm-spark-cli for a user profile or shared space

Install the Spark UI manually in a SageMaker Studio domain

To host the Spark UI on SageMaker Studio, complete the following steps:

  1. Choose System terminal from the SageMaker Studio launcher.

  2. Run the following commands in the system terminal:
curl -LO https://github.com/aws-samples/amazon-sagemaker-spark-ui/releases/download/v0.1.0/amazon-sagemaker-spark-ui-0.1.0.tar.gz
tar -xvzf amazon-sagemaker-spark-ui-0.1.0.tar.gz
cd amazon-sagemaker-spark-ui-0.1.0/install-scripts
chmod +x install-history-server.sh
./install-history-server.sh

The commands will take a few seconds to complete.

  3. When the installation is complete, you can start the Spark UI by using the provided sm-spark-cli and access it from a web browser by running the following code:

sm-spark-cli start s3://DOC-EXAMPLE-BUCKET/<SPARK_EVENT_LOGS_LOCATION>

You can configure the Amazon S3 location where SageMaker Processing, AWS Glue, or Amazon EMR store the event logs when you run your Spark applications.

For SageMaker Studio notebooks and AWS Glue Interactive Sessions, you can set up the Spark event log location directly from the notebook by using the sparkmagic kernel.

The sparkmagic kernel contains a set of tools for interacting with remote Spark clusters through notebooks. It offers magic commands (such as %spark and %sql) to run Spark code, perform SQL queries, and configure Spark settings like executor memory and cores.
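
For example, the following notebook cell is a minimal sketch of how you might write Spark event logs to Amazon S3 with the %%configure magic; the bucket path is a placeholder, and the exact magics and parameters available depend on the kernel you use (sparkmagic or the AWS Glue interactive sessions kernel):

%%configure -f
{
    "conf": {
        "spark.eventLog.enabled": "true",
        "spark.eventLog.dir": "s3://DOC-EXAMPLE-BUCKET/<SPARK_EVENT_LOGS_LOCATION>/"
    }
}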

For the SageMaker Processing job, you can configure the Spark event log location directly from the SageMaker Python SDK.
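
The following snippet is a sketch of that approach using the SageMaker Python SDK's PySparkProcessor, which accepts the event log destination through the spark_event_logs_s3_uri parameter of its run method; the role ARN, script name, instance settings, and bucket path are placeholders to replace with your own values:

from sagemaker.spark.processing import PySparkProcessor

# Managed Spark cluster for the processing job (instance settings are examples)
spark_processor = PySparkProcessor(
    base_job_name="sm-spark",
    framework_version="3.1",
    role="<SAGEMAKER_EXECUTION_ROLE_ARN>",
    instance_count=2,
    instance_type="ml.m5.xlarge",
)

# Run a PySpark script and persist the Spark event logs to Amazon S3
spark_processor.run(
    submit_app="./preprocess.py",
    spark_event_logs_s3_uri="s3://DOC-EXAMPLE-BUCKET/<SPARK_EVENT_LOGS_LOCATION>",
)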

Refer to the AWS documentation for additional information.

You can choose the generated URL to access the Spark UI.

The following screenshot shows an example of the Spark UI.

You can check the status of the Spark History Server by using the sm-spark-cli status command in the Studio System terminal.

You can also stop the Spark History Server when needed.

Automate the Spark UI installation for users in a SageMaker Studio domain

As an IT admin, you can automate the installation for SageMaker Studio users by using a lifecycle configuration. This can be done for all user profiles under a SageMaker Studio domain or for specific ones. See Customize Amazon SageMaker Studio using Lifecycle Configurations for more details.

You can create a lifecycle configuration from the install-history-server.sh script and attach it to an existing SageMaker Studio domain. The installation is run for all the user profiles in the domain.

From a terminal configured with the AWS Command Line Interface (AWS CLI) and appropriate permissions, run the following commands:

curl -LO https://github.com/aws-samples/amazon-sagemaker-spark-ui/releases/download/v0.1.0/amazon-sagemaker-spark-ui-0.1.0.tar.gz
tar -xvzf amazon-sagemaker-spark-ui-0.1.0.tar.gz
cd amazon-sagemaker-spark-ui-0.1.0/install-scripts

LCC_CONTENT=`openssl base64 -A -in install-history-server.sh`

aws sagemaker create-studio-lifecycle-config \
  --studio-lifecycle-config-name install-spark-ui-on-jupyterserver \
  --studio-lifecycle-config-content $LCC_CONTENT \
  --studio-lifecycle-config-app-type JupyterServer \
  --query 'StudioLifecycleConfigArn'

aws sagemaker update-domain --region {YOUR_AWS_REGION} \
  --domain-id {YOUR_STUDIO_DOMAIN_ID} \
  --default-user-settings '{
    "JupyterServerAppSettings": {
      "DefaultResourceSpec": {
        "LifecycleConfigArn": "arn:aws:sagemaker:{YOUR_AWS_REGION}:{YOUR_AWS_ACCOUNT_ID}:studio-lifecycle-config/install-spark-ui-on-jupyterserver",
        "InstanceType": "system"
      },
      "LifecycleConfigArns": [
        "arn:aws:sagemaker:{YOUR_AWS_REGION}:{YOUR_AWS_ACCOUNT_ID}:studio-lifecycle-config/install-spark-ui-on-jupyterserver"
      ]
    }
  }'

After Jupyter Server restarts, the Spark UI and the sm-spark-cli will be available in your SageMaker Studio environment.

Clean up

In this section, we show you how to clean up the Spark UI in a SageMaker Studio domain, either manually or automatically.

Manually uninstall the Spark UI

To manually uninstall the Spark UI in SageMaker Studio, complete the following steps:

  1. Choose System terminal in the SageMaker Studio launcher.

  2. Run the following commands in the system terminal:
cd amazon-sagemaker-spark-ui-0.1.0/install-scripts
chmod +x uninstall-history-server.sh
./uninstall-history-server.sh

Uninstall the Spark UI automatically for all SageMaker Studio user profiles

To automatically uninstall the Spark UI in SageMaker Studio for all user profiles, complete the following steps:

  1. On the SageMaker console, choose Domains in the navigation pane, then choose the SageMaker Studio domain.

  2. On the domain details page, navigate to the Environment tab.
  3. Select the lifecycle configuration for the Spark UI on SageMaker Studio.
  4. Choose Detach.

  5. Delete and restart the Jupyter Server apps for the SageMaker Studio user profiles.

Conclusion

In this post, we shared a solution you can use to quickly install the Spark UI on SageMaker Studio. With the Spark UI hosted on SageMaker Studio, machine learning (ML) and data engineering teams can use scalable cloud compute to access and analyze Spark logs from anywhere and speed up their project delivery. IT admins can standardize and expedite the provisioning of the solution in the cloud and avoid proliferation of custom development environments for ML projects.

All the code shown as part of this post is available in the GitHub repository.


About the Authors

Giuseppe Angelo Porcelli is a Principal Machine Learning Specialist Solutions Architect for Amazon Web Services. With several years of software engineering experience and an ML background, he works with customers of any size to understand their business and technical needs and design AI and ML solutions that make the best use of the AWS Cloud and the Amazon Machine Learning stack. He has worked on projects in different domains, including MLOps, computer vision, and NLP, involving a broad set of AWS services. In his free time, Giuseppe enjoys playing football.

Bruno Pistone is an AI/ML Specialist Solutions Architect for AWS based in Milan. He works with customers of any size, helping them understand their technical needs and design AI and ML solutions that make the best use of the AWS Cloud and the Amazon Machine Learning stack. His areas of expertise include end-to-end machine learning, machine learning industrialization, and generative AI. He enjoys spending time with his friends and exploring new places, as well as traveling to new destinations.
