Published on December 7, 2020

Understanding Azure Data Lake Analytics

In today’s data-driven world, organizations deal with massive amounts of data generated from many sources, and storing and processing that data efficiently is crucial for gaining valuable insights. Azure Data Lake Analytics, Microsoft’s on-demand analytics service in the Azure cloud, lets organizations process and analyze large volumes of data cost-effectively and efficiently.

What is Azure Data Lake Analytics?

Azure Data Lake Analytics is a fully serverless offering from Microsoft: there are no infrastructure instances or clusters to provision, which makes it easy to use and cost-effective. You simply submit jobs, and the service processes and analyzes massive amounts of data in a massively parallel fashion.

The service’s native language is U-SQL, which combines SQL’s declarative syntax with C# expressions and types. U-SQL can also be extended with custom .NET assemblies, and extensions are available for languages such as R and Python.
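
To make the hybrid syntax concrete, here is a minimal U-SQL sketch; the rowset, column names, and output path are illustrative placeholders rather than anything from this article. The SELECT is ordinary SQL, while ToUpper() and Math.Round() are plain C# calls evaluated per row:

    // Build a small inline rowset (illustrative data only).
    @rows =
        SELECT * FROM
            (VALUES
                ("contoso", 1500.4),
                ("fabrikam", 900.9)
            ) AS T(Customer, Amount);

    // SQL set logic with C# expressions mixed in.
    @result =
        SELECT Customer.ToUpper() AS Customer,   // C# string method
               Math.Round(Amount) AS Amount      // C# Math class
        FROM @rows;

    // Persist the result with a built-in outputter.
    OUTPUT @result
    TO "/output/demo.csv"
    USING Outputters.Csv();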

Getting Started with Azure Data Lake Analytics

To get started with Azure Data Lake Analytics, you need to create an Azure Data Lake Analytics account. Here are the steps:

  1. Log in to your Azure account and navigate to the Azure portal.
  2. Click on “All Services”, select “Analytics”, and then “Data Lake Analytics”.
  3. Click on the “Create data lake analytics” button to start the account creation wizard.
  4. Provide a relevant name for the account, then select the subscription, region, and the Data Lake Storage account that will serve as the default store.
  5. Choose the pricing package that suits your needs, either pay-as-you-go or monthly commitment.
  6. Review the configuration and click on “Create” to create the Data Lake Analytics account.
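
If you prefer scripting to the portal wizard, the same account can be created with the Azure CLI’s az dla command group. This is a sketch under assumptions: the account name, resource group, region, and Data Lake Store (Gen1) account below are placeholders, and the default store must already exist:

    # Create a Data Lake Analytics account (all names and the region are placeholders).
    az dla account create \
        --account myadlaaccount \
        --resource-group my-resource-group \
        --location eastus2 \
        --default-data-lake-store myadlsstore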

Once the account is created, you can explore the options available in the dashboard view. The menu bar lets you create new jobs, open sample scripts, browse storage with the data explorer, view all jobs, add users, and delete the account.

Creating and Executing Jobs in Azure Data Lake Analytics

Now that you have created an Azure Data Lake Analytics account, you can start running jobs to process and analyze your data. Here are the steps to create and execute a job:

  1. Click on the “New Job” button in the dashboard to open the job editor console.
  2. Provide a job name and specify the number of Analytics Units (AUs) the job will use. AUs determine the compute allocated to the job, and pay-as-you-go pricing is $2 per AU per hour, so a job that runs for 30 minutes on 10 AUs costs about 10 × 0.5 × $2 = $10.
  3. Specify the logic of the job using U-SQL; for example, you can select data and write the output to a file (see the sample script after this list).
  4. Submit the job and monitor its execution in the job execution view.
  5. Once the job completes, you can view the details, including the cost, the AUs consumed, and the output file generated.
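
As a concrete illustration of step 3, here is a minimal U-SQL job script of the kind described above; the input path, schema, and filter threshold are placeholder assumptions for your own data:

    // Read a CSV file, skipping its header row (path and schema are placeholders).
    @orders =
        EXTRACT OrderId int,
                Region  string,
                Total   decimal
        FROM "/input/orders.csv"
        USING Extractors.Csv(skipFirstNRows: 1);

    // Keep only the larger orders.
    @large =
        SELECT OrderId, Region, Total
        FROM @orders
        WHERE Total > 1000;

    // Write the filtered rows to an output file with a header row.
    OUTPUT @large
    TO "/output/large-orders.csv"
    USING Outputters.Csv(outputHeader: true);

Submitting this script from the job editor produces a single output file in the account’s default store; raising the AU count mainly pays off when the input is large enough to be split across many parallel vertices.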

By creating an Azure Data Lake Analytics account and running a job, you can quickly gain a practical understanding of how to process and analyze data using this powerful service.

Conclusion

Azure Data Lake Analytics is a valuable service offered by Microsoft in the Azure cloud. It allows organizations to process and analyze massive amounts of data cost-effectively and efficiently. By understanding the basics of the service and running a few jobs, you can unlock the full potential of your data and gain valuable insights for your business.
