Categories: DB optimization, T-SQL

Generating a Workload for SQL Server

There are special scenarios in which we need to generate a customized workload on our SQL Server instance; this workload could consist of specific T-SQL statements, such as simple queries or stored procedures. In some cases, we need to simulate a workload to troubleshoot common errors related to deadlocks or race conditions, or for testing purposes in which we want to analyze the behavior of SQL Server in the presence of certain conditions or data distributions.

In recent years I have been working on optimization and troubleshooting of SQL Server databases. During this time I have faced complex issues in production; however, before fixing them we must try to reproduce the workload in a test environment in order to identify the root cause of the problem. I usually work with two different approaches to achieve this. The first is based on a free tool called SQLQueryStress (https://github.com/ErikEJ/SqlQueryStress). It was originally developed by Adam Machanic; nowadays it is hosted on GitHub and open to the developer community.

The following screenshot shows it:

If we click the Database button, we can set up the connection details (server, database, authentication). On the internet you can find several blogs that explain it in detail. In my day-to-day work, and for future reuse, I normally take a screenshot of this application as part of the documentation of the fix. An interesting option recently added to this app is the ability to store a configuration file, which saves us from having to parameterize it every time we open it.

I would also like to explain an alternative option which requires a more customized building process, especially when you want more control over your workload. The credit for this approach goes to the amazing folks at SQLskills (http://www.sqlskills.com); I had the opportunity to learn this technique in a workshop taught by Erin Stellato. The steps we should follow are:

  1. Build the T-SQL queries or stored procedures that we want to use.
  2. Create a command file which uses the sqlcmd utility.
  3. Create a .cmd file which calls the command file previously created.
  4. Create a .cmd file which allows us to stop the workload generation.

Let me guide you through the process of completing the steps listed above. In this example I am using the new sample database Wide World Importers; below you can find the link to download it:

https://github.com/Microsoft/sql-server-samples/releases/tag/wide-world-importers-v1.0

Once you have installed it, open SSMS and type the following T-SQL query, which simply gets all the rows in the PurchaseOrders table. We place the statement inside an infinite loop to produce continuous execution, which behaves like a very busy app receiving many concurrent calls.

USE [WideWorldImporters];
GO

SET NOCOUNT ON;

WHILE 1 = 1
BEGIN

    SELECT PurchaseOrderID,
           SupplierID,
           OrderDate,
           DeliveryMethodID,
           ExpectedDeliveryDate
    FROM [Purchasing].[PurchaseOrders];

END;

We have to save this file in .sql format; I chose the name WWI_Client.sql. The next step consists of creating a command file which internally uses the sqlcmd utility. I will not talk in depth about sqlcmd, but I would like to quote the official description from the Microsoft documentation:

“The sqlcmd utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches.”

For our case, we are going to use a simple syntax to connect to our specific SQL Server instance and execute the content of WWI_Client.sql; you should replace the Windows path below with your own file path.


sqlcmd -S NameOfYourServerHere -i"C:\YourWindowsPath\WWI_Client.sql"

exit

Dissecting the previous command: the word sqlcmd invokes the utility, and the -S parameter allows us to specify the server name, which you can take from your own SSMS as shown here:

The last parameter, -i, points to a specific file which will be used as input; sqlcmd is in charge of interpreting it. It is very important to always test our T-SQL code in an isolated session before calling it from sqlcmd, because if there is any error it is better to detect it in advance. The exit instruction avoids having to keep the cmd window open. We have to save the file in .cmd format; I chose the name RunWWI_Client.cmd.
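The example above relies on the default connection credentials. If your instance uses SQL Server authentication instead, sqlcmd accepts the -U and -P parameters; a variant of RunWWI_Client.cmd might look like the following sketch (the login name and password are placeholders you should replace):

REM Variant of RunWWI_Client.cmd using SQL Server authentication
sqlcmd -S NameOfYourServerHere -U WorkloadUser -P YourPasswordHere -i"C:\YourWindowsPath\WWI_Client.sql"
exit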

The next step is to call RunWWI_Client.cmd from a new .cmd file a specific number of times; here is where we decide how many calls we wish to execute. For this example I chose only 25 calls. Remember to type the exit instruction at the end of the file.

start RunWWI_Client.cmd
start RunWWI_Client.cmd
REM ... (23 more times)
exit
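If you prefer not to repeat the start line 25 times, cmd can also generate the calls in a loop; a sketch of an equivalent GetWWI_Clients.cmd (the count of 25 matches the example above):

REM Equivalent loop form: launch 25 concurrent clients
for /L %%i in (1,1,25) do start RunWWI_Client.cmd
exit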

The name of this .cmd file would be GetWWI_Clients.cmd; to test this approach we simply double-click the file. What does it do? Internally, it creates 25 connections, each executing the query stored in WWI_Client.sql; in this way we can control how heavy and intense the desired workload is. If you want to check what is happening behind the scenes, you can use the DMV sys.dm_os_waiting_tasks to see details about the wait queue of tasks that are waiting on resources.
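While the workload is running, a quick inspection of that DMV might look like the following sketch (filtering on session_id > 50 is a common convention to exclude most system sessions):

SELECT session_id,
       wait_type,
       wait_duration_ms,
       blocking_session_id,
       resource_description
FROM sys.dm_os_waiting_tasks
WHERE session_id > 50;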

More details about it can be found in this link: https://bit.ly/2A4QapZ

Finally, the last step consists of creating a .cmd file which uses sqlcmd to stop the running workload. To do this, create a new file and add the following instructions:

sqlcmd -E -S YourServer -Q "DECLARE @KillSessions nvarchar(MAX) = (SELECT 'KILL ' + CAST(session_id AS varchar(20)) + ';' FROM sys.dm_exec_sessions WHERE login_name = 'YourUserWhichYouNeedToKillSessions' FOR XML PATH('')); EXEC(@KillSessions);"
exit

The important aspect of the previous statement is its capability to kill all the sessions opened by a specific user; in this case, you should fill in login_name with the same user you used to generate the workload, that is, the user under which you double-clicked RunWWI_Client.cmd. Basically, the -Q parameter contains a dynamic T-SQL statement that identifies all the sessions associated with that login_name through the DMV sys.dm_exec_sessions, stores the composed KILL statements in the @KillSessions variable, and then executes it, automatically stopping the workload against the SQL Server instance.
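Before running the kill script, it can be prudent to preview which sessions would actually be affected; a simple check against the same DMV (the login name here is the same placeholder as above):

SELECT session_id, login_name, status, program_name
FROM sys.dm_exec_sessions
WHERE login_name = 'YourUserWhichYouNeedToKillSessions';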

Summary

As we can see, there are different approaches to producing a workload. The difference between using SQLQueryStress and sqlcmd is more about our preferences and the level of customization we are willing to do. I usually use SQLQueryStress for testing or reproducing multiple concurrent calls to evaluate the improvement after (re)writing a query. On the other hand, when I want to evaluate the response of SQL Server in specific scenarios, such as a deadlock, I normally use the sqlcmd approach.

Note: I do not expect you to agree with me about when to use one approach or the other, but I would like you to have at least one effective way to do it. I hope this article has been useful; please don't hesitate to write to me if you have any questions or doubts.

Happy Querying!!!!

geohernandez
