Understanding the numerous types of variables SQL Server has to offer, and their appropriate usage, is one of the cornerstones of developing effective database code. I recently helped a co-worker fix an error message in his code.
-- Error
Msg 137, Level 15, State 2, Line 3
Must declare the scalar variable "@l_INT".
The code snippet looked something like this, which led to a discussion about scope of T-SQL variables and an interesting way to fix this problem.
CREATE TABLE #scope_test1 (col1 INT);
CREATE TABLE #scope_test2 (col1 INT);
GO
DECLARE @l_INT INT = 42;
INSERT INTO #scope_test1 (col1) SELECT @l_INT;
GO
INSERT INTO #scope_test2 (col1) SELECT @l_INT;
GO
The T-SQL language supports local variables (their names begin with a single @). The scope of a variable is the range of Transact-SQL statements that can reference it, and it lasts from the point the variable is declared until the end of the batch or stored procedure in which it is declared. A T-SQL batch is a group of statements that SQL Server parses and executes as a single unit. A client tool like SQL Server Management Studio (SSMS) marks the end of a batch with the GO command (you can configure any word as the batch separator in SSMS, but we will leave that discussion for another time).

The names of some Transact-SQL system functions begin with two at signs (@@). They are commonly referred to as global variables, but they are really system-defined functions, not variables, and do not have the same behaviors as variables. You do not need to declare them, since the server maintains them constantly. The global variables represent information specific to the server or the current user session. Some of the commonly used ones are @@ERROR, @@IDENTITY and @@VERSION.

Traditionally, the DECLARE statement is used to declare a local variable, and SET or SELECT is used to initialize its value. Since SQL Server 2008, both of these tasks can be accomplished in a single statement;
DECLARE @MyCounter INT = 12;
The variables we have discussed so far are scalar variables, which can hold a single data value of a specific type. A table variable is a special data type that can be used to store a result set for processing at a later time. The table data type is primarily used for temporary storage of a set of rows, such as the result set of a table-valued function. Functions and variables can be declared to be of type table, and table variables can be used in functions, stored procedures, and batches. A DECLARE statement (just as with local scalar variables) is used to declare a table variable. While table variables behave exactly like local variables, with a well-defined scope, they can also be thought of as being similar to temporary tables, but with several limitations.
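As a quick illustration (a hypothetical sketch; the table and column names are made up for this example), a table variable is declared, populated and queried within a single batch, and goes out of scope when the batch ends, just like a scalar variable:

```sql
-- Declare a table variable, load it, and read from it - all in one batch
DECLARE @orders TABLE (
    order_id INT NOT NULL,
    amount   DECIMAL(10, 2) NOT NULL
);

INSERT INTO @orders (order_id, amount)
VALUES (1, 100.00), (2, 250.50);

SELECT order_id, amount
FROM @orders;
GO

-- This would fail with "Must declare the table variable @orders",
-- because the table variable went out of scope when the batch ended:
-- SELECT * FROM @orders;
```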
The SQLCMD utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches. Scripting variables can be used in SQLCMD scripts. Scripting variables enable one script to be used in multiple scenarios. The setvar command is used to define scripting variables. Variables that are defined by using the setvar command are stored internally. Scripting variables should not be confused with environment variables that are defined at the command prompt by using SET.
The scope of a SQLCMD scripting variable can span several batches, which makes it possible to implement variables that do not go out of scope even when a batch ends. The example below is a simplified demo;
:setvar l_int "42"
CREATE TABLE #scope_test1 (col1 INT);
CREATE TABLE #scope_test2 (col1 INT);
GO
INSERT INTO #scope_test1 (col1) SELECT $(l_int);
GO
INSERT INTO #scope_test2 (col1) SELECT $(l_int);
GO

-- Results
(1 row(s) affected)

(1 row(s) affected)
A review of the types of variables the T-SQL language has to offer has helped refresh my understanding of their appropriate usage.
- Table Variables – http://msdn.microsoft.com/en-us/library/ms175010.aspx
- T-SQL Variables – http://msdn.microsoft.com/en-us/library/ms187953(v=sql.105).aspx
- Global Variables – http://www.codeproject.com/Articles/39131/Global-Variables-in-SQL-Server
- Scripting variables with SQLCMD – http://msdn.microsoft.com/en-us/library/ms188714.aspx
- SQLCMD utility - http://msdn.microsoft.com/en-us/library/ms162773.aspx
SQL Saturday #145 was my first trip to Nashville and I must say I am impressed with the city. The recommended event hotel was located in a very nice part of town, a short drive from Radnor Lake State Park, and the weather on Friday evening was just perfect for a walk in the park.
The event venue, David Lipscomb University, was also a short drive from the recommended hotel and has a very scenic campus. There were plenty of signs and the location was fairly easy to find. The registration desk was well managed and the breakfast was excellent. I picked up my speaker shirt and went looking for my friends!
I met with Kathi Kellenberger (b|t), Malathi Mahadevan (b|t), Kevin Kline (b|t), David Klee (b|t) and Rick Morlean (b|t) on the morning of the event. I was particularly excited about my 11:00 AM session on Parameter Sniffing, as I had recently updated all of my demos to run on SQL Server 2012. My session was very well received, and I got a great set of attendees who were very interested in learning about parameter sniffing. The session slide deck and demo scripts are available for download on the SQL Saturday #145 website.
After my session, I spent a good deal of time talking to attendees, other speakers, organizers and sponsors. The organizers did a great job putting on a spectacular SQL Saturday, and I would like to thank them for the opportunity to speak at this event. This was my last SQL Saturday for the year and I look forward to more SQL Saturdays in 2013!
I have two speaking engagements coming up this week that I am really excited about. I have my very first webinar coming up on 10/11/2012. Pragmatic Works is giving me an opportunity to present my session on Parameter Sniffing in their Free Training on the T's webinar series. You can click here to register for this webinar, scheduled at 11:00 AM EST on 10/11. Though I have presented this session at several SQL Saturdays in the past, I am looking forward to my very first experience delivering this session as a webinar!
My second speaking engagement for this week is at SQL Saturday #145 in Nashville, on 10/13/2012. I will be talking about Parameter Sniffing at 11:00 AM in Room 2. You can click here for the full event schedule, and if you haven't already signed up for this event, you can register here. I have recently updated my demos for this session to run on SQL Server 2012, so I am really looking forward to presenting them. SQL Saturday #145 not only has an exciting session line-up on the event day, but also 4 fantastic pre-cons on Friday 10/12. This will also be my first trip to Nashville, the "Music City", and I plan on visiting the Country Music Hall of Fame!
SQL Saturday #122 | Louisville, KY on July 21, 2012 was the 5th SQL Saturday I have attended so far, and my 3rd as a speaker. The St.Louis contingent – Kathi Kellenberger, Kim Tessereau, Mike Lynn, Jay Carter, Cindy Baker and me! – was especially excited to attend this event, not only because it's organized by our friend Malathi Mahadevan, but also for a chance to escape the St.Louis heat!
We had almost forgotten about the time zone change when we drove into Louisville at 6 PM Central sharp, only to realize we were an hour late for the speakers’ dinner! The Bristol Bar & Grille was the perfect location for a great speakers’ dinner, and gave us all a chance to relax, network and enjoy some good food (My personal favorite was the Espresso Crème Brûlée).
The University of Louisville is a short 10-minute drive from the Marriott hotel, and thanks to the email notifications with directions, as well as plenty of signs, we had no trouble finding the venue. Thanks to SpeedPASS, there were no lines at the registration desk, and I found they had my favorite Asiago cheese bagels for breakfast! My first session of the day was Andy Thiru's "SQL Azure Intro and What's New" session, and it surely exceeded my expectations. I had never had the opportunity to work with SQL Azure before, and this session gave me the knowledge and tools to get started on my own. The next session on my list was "What Sequence objects are (and are not)" by Louis Davidson. I used to be an Oracle DBA until a few years ago, and took sequences for granted, until I discovered SQL Server didn't have them (until 2012). With their introduction in SQL Server 2012, I took this opportunity to get myself reacquainted with sequences.
I had some delicious Veggie Wraps and a Cookie for lunch – again, no lines and no waiting! Post lunch, I took a break in the Speakers’ Lounge to review my upcoming session on Parameter Sniffing, where I discovered a cooler full of Ice Cream! I had to stop myself after two servings and got back to reviewing my slides & checking my demos. A majority of the attendees for my session were quite involved with the topic, giving rise to several discussions and Q&A, thus making my session all the more valuable for everyone in the room. I was really pleased with the generous evaluations and great feedback for my session.
The last session of the day for me was "Bulletproof: Hardening your SQL Server from Attack" by Sarah Barela. As a developer, I take care of hardening my code against SQL injection, but usually let administrators worry about securing the servers and databases. This session revealed the amount of work administrators (database, server as well as network) put in to secure our servers! After the last session, it was time for the closing ceremonies and raffle. The SQL Saturday #122 team hosted a great event with a full day of valuable SQL learning. I am really thankful to the SQL Saturday #122 team for giving me the opportunity to present my session, and to all the sponsors whose support makes such events possible.
I am looking forward to seeing my friends from Louisville again at SQL Saturday #154 in St.Louis on Sept 15th, the very first SQL Saturday in St.Louis!
While brushing up on my knowledge of software testing concepts, I came across quite an amusing definition of testing: "To tell somebody that he is wrong is called criticism. To do so officially is called testing." A programmer usually resents it when a tester finds a defect in his code. We programmers thoroughly unit test our code before handing it off to a tester, because we take pride in developing a bug-free application. Some programming languages (C#, VB, ASP.NET) lend themselves to unit testing easily, because the application is developed within Visual Studio and can readily leverage its unit testing framework.
Visual Studio allows you to create database projects, and database developers have been embracing them since Visual Studio Team System 2008 Database Edition GDR. This offers a robust framework for database developers to identify bugs in their database objects (schemas, stored procedures, functions, etc.) by unit testing their database (T-SQL) code before handing it over to the tester. Before we jump into the specifics of database unit testing with Visual Studio, the next couple of paragraphs warm us up to the topic by covering a few basic concepts of software testing.
Software testing undoubtedly plays an important role in the life cycle of most IT projects. The goal of any type of software testing is to identify defects to be fixed, so that the product meets requirements and has a deterministic, predictable output. Depending on the testing method employed, testing can be implemented at any time in the development process. Different software development models focus the test effort at different phases of the development process. Newer development models, such as Agile, often employ test-driven development and place an increased portion of the testing in the hands of the developer, before it reaches a formal team of testers.
Software testing methods are traditionally divided into white-box and black-box testing. These two approaches describe the point of view a test engineer takes when designing test cases. Unit testing falls under the category of white-box testing, where the tester has access to the internal data structures and algorithms, including the code that implements them. This is in contrast with the black-box testing method, which treats the software as a "black box", without any knowledge of its internal implementation. A black-box tester is usually not a programmer, and aims only to test the functionality of the software according to the applicable requirements. Since the black-box tester has no knowledge of the underlying code, he may find bugs that a programmer misses. However, the same principle can sometimes lead to writing inefficient or incomplete test cases.
Unit testing is a key component of test-driven development (TDD). Unit tests are usually written by developers while they work on the code, to ensure that a specific piece of code (function, class, stored procedure, etc.) is working as expected. Unit testing helps to identify defects in the earlier stages of the software development life cycle, where they are cheaper to fix. Unit testing can prove especially challenging in the world of database development, because of the need for a consistent test environment.
Database Unit Tests are used to establish a baseline state for a database and then to verify any subsequent changes that you make to database objects. The Unit Testing Framework in Visual Studio (starting with VSTS 2005) helps database developers create, manage and execute Unit Tests for a Database. The Microsoft.VisualStudio.TestTools.UnitTesting namespace supplies classes that provide unit testing support. This namespace contains many attributes that identify test information to the test engine regarding data sources, order of method execution, program management, agent/host information, and deployment data. It also contains custom unit testing exceptions.
You will need the Database Edition GDR of VSTS 2008, or the Ultimate (or Premium) editions of Visual Studio 2010, to create, modify and run database unit tests. You can run database unit tests with the Professional Edition of Visual Studio 2010, but you cannot create or modify them. Before you can start running database unit tests, you must first create a database project and then create a test project. The next step is to write sets of Transact-SQL tests that exercise your database objects. Executing these tests in your isolated development environment helps you verify whether those objects behave correctly before you check them in to version control. As changes are made to the database schema, you can use these tests to verify whether the changes have broken existing functionality. A detailed step-by-step walkthrough for creating and running database unit tests can be found here on MSDN. Once created, the unit test project and its tests will show up in the Solution Explorer view;
In a typical database unit test, a Transact-SQL test script runs and returns an instance of the ExecutionResult class. The instance of this class contains a DataSet, the execution time, and the rows affected by the script. All of this information is collected during execution of the script. These results can be evaluated within the Transact-SQL script by using RAISERROR, or they can be evaluated by using test conditions. Visual Studio Premium provides the following predefined test conditions;
- Data Checksum
- Empty Resultset
- Execution Time
- Expected Schema
- Not Empty Resultset
- Row Count
- Scalar Value
- Run multiple unit tests as a group
- Run tests as a part of a Build
- Enforce check-in policy
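To make this concrete, here is a minimal sketch of the Transact-SQL body of a database unit test. The procedure name dbo.GetOrderCount and its parameter are hypothetical placeholders; the script signals failure through RAISERROR, and the result set it returns can additionally be evaluated by test conditions such as Scalar Value or Row Count:

```sql
-- Database unit test body (sketch): verify a stored procedure's return value.
-- dbo.GetOrderCount is a hypothetical procedure under test.
DECLARE @expected INT = 3;
DECLARE @actual   INT;

EXEC @actual = dbo.GetOrderCount @CustomerId = 42;

IF (@actual <> @expected)
    RAISERROR (N'GetOrderCount returned %d, expected %d.', 16, 1,
               @actual, @expected);

-- Returning a result set lets predefined test conditions
-- (Scalar Value, Row Count, etc.) evaluate it as well.
SELECT @actual AS order_count;
```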
- Software Testing Principles, Terminology & Definitions - http://en.wikipedia.org/wiki/Software_testing
- Unit Testing Framework in Visual Studio -
- Database Unit Testing with Visual Studio -
- Unit Test Conditions in Database Unit Tests - http://msdn.microsoft.com/en-us/library/aa833423
- Test Lists - http://msdn.microsoft.com/en-us/library/ms182461.aspx
- Visual Studio 11 (Beta) Unit Testing updates - http://blogs.msdn.com/b/visualstudioalm/archive/2012/03/08/what-s-new-in-visual-studio-11-beta-unit-testing.aspx
- Software Testing Jokes - http://softwaretestingfundamentals.com/software-testing-jokes/
Service Broker has come a long way since its introduction in Microsoft SQL Server 2005. SQL Server 2008 brought a host of Service Broker enhancements, like conversation priorities, the ssbdiagnose utility, alternate poison message handling and new performance objects & counters. SQL Server 2012 is a major release that comes with an astonishing array of enhancements and new features, including new bells & whistles for Service Broker.
A significant new feature for Service Broker is the ability to multicast messages. This enables a single initiator service to send messages to multiple target services, akin to SQL Server replication, where multiple subscribers can subscribe to the same publication. The syntax of the SEND command has been extended to enable multicasting by allowing multiple conversation handles.
DECLARE @cvh1 UNIQUEIDENTIFIER, @cvh2 UNIQUEIDENTIFIER, @cvh3 UNIQUEIDENTIFIER, @TestMsgBody XML;
SET @TestMsgBody = '<test>Test Message</test>';

BEGIN DIALOG @cvh1
    FROM SERVICE [//InitiatorService]
    TO SERVICE '//TargetService1'
    ON CONTRACT [//TestProcessing];

BEGIN DIALOG @cvh2
    FROM SERVICE [//InitiatorService]
    TO SERVICE '//TargetService2'
    ON CONTRACT [//TestProcessing];

BEGIN DIALOG @cvh3
    FROM SERVICE [//InitiatorService]
    TO SERVICE '//TargetService3'
    ON CONTRACT [//TestProcessing];

SEND ON CONVERSATION (@cvh1, @cvh2, @cvh3)
    MESSAGE TYPE [//TestMgsType] (@TestMsgBody);
Another new feature is the addition of a new column, message_enqueue_time, to queues. This column helps us determine the time spent by a message on the queue. The message_enqueue_time column is of the datetimeoffset data type and stores the date and time (with time zone awareness) when the message arrives on the queue. It's exposed to the application via a direct query on the queue itself, as well as a column in the RECEIVE statement. Let's look at an example;
-- Begin dialog & send message
DECLARE @ch UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @ch
    FROM SERVICE [manual_activation_initiator_service]
    TO SERVICE 'manual_activation_target_service'
    ON CONTRACT [manual_activation_contract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @ch
    MESSAGE TYPE [manual_activation_demo_request] ('<test>test_message_body</test>');
GO

-- A direct SELECT statement from the queue
SELECT conversation_handle
      ,CAST(message_body AS XML) AS [msg_body]
      ,DATEDIFF(second, message_enqueue_time, GETUTCDATE()) AS [time_in_queue(Seconds)]
FROM [manual_activation_target_queue];

-- Results
conversation_handle                  msg_body                        time_in_queue(Seconds)
------------------------------------ ------------------------------- ----------------------
C0CB3200-809F-E111-8B29-005056BE0088 <test>test_message_body</test>  21

(1 row(s) affected)

-- Retrieve the message off of the queue with a RECEIVE
WAITFOR (
    RECEIVE TOP(1) conversation_handle
          ,CAST(message_body AS XML) AS [msg_body]
          ,DATEDIFF(second, message_enqueue_time, GETUTCDATE()) AS [time_in_queue(Seconds)]
    FROM manual_activation_target_queue), TIMEOUT 5;

-- Results
conversation_handle                  msg_body                        time_in_queue(Seconds)
------------------------------------ ------------------------------- ----------------------
C0CB3200-809F-E111-8B29-005056BE0088 <test>test_message_body</test>  21

(1 row(s) affected)
Please note that this column is not yet well documented, and to derive the time a message has spent on the queue in seconds, one must use a few date and time functions, as shown above.
Service Broker remote messaging needs some additional configuration in order to work with Always On Availability Groups, due to the complexity added by the availability group listener. The Always On Availability Groups feature, introduced in SQL Server 2012, is a high availability and disaster recovery solution that provides an enterprise-level alternative to database mirroring.
For a service in an availability group to be able to receive remote messages;
- The Availability Group must have a listener configured
- Service Broker Endpoint must be created and configured with the Listener, for every instance of SQL Server in the Availability Group
- Connect permissions must be granted on the endpoints to the appropriate login(s)
- msdb must contain a route
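A minimal sketch of this receive-side setup might look like the following. The endpoint name, port, login and service names here are hypothetical placeholders, not values from any particular environment:

```sql
-- On every instance in the availability group: a Service Broker endpoint
CREATE ENDPOINT [ssb_endpoint]
    STATE = STARTED
    AS TCP (LISTENER_PORT = 4022)
    FOR SERVICE_BROKER (AUTHENTICATION = WINDOWS, ENCRYPTION = REQUIRED);
GO

-- Grant connect on the endpoint to the appropriate login
GRANT CONNECT ON ENDPOINT::[ssb_endpoint] TO [domain\ssb_login];
GO

-- In msdb: a route that redirects incoming messages to the target service
-- hosted in the availability database on this instance
USE msdb;
GO
CREATE ROUTE [incoming_target_service_route]
    WITH SERVICE_NAME = '//TargetService',
         ADDRESS = 'LOCAL';
GO
```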
For a Service in an Availability Group to be able to Send messages to a remote service;
- Configure a route to the target service using the listener
- msdb must contain a route
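On the sending side, the route in the initiator database points at the availability group listener rather than at an individual instance, so that messages follow the primary replica after a failover. Again, the route name, service name, listener name and port below are hypothetical placeholders:

```sql
-- In the sending database: route the target service through the AG listener
CREATE ROUTE [route_to_target_via_listener]
    WITH SERVICE_NAME = '//TargetService',
         ADDRESS = 'TCP://ag_listener_name:4022';
GO
```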
And last but not least, it's always worth mentioning that since SQL Server 2008 R2, poison message handling can be disabled at the time of queue creation, or at a later time by using the ALTER QUEUE statement. You can also check the status of poison message handling for each queue by querying the sys.service_queues catalog view, whose is_poison_message_handling_enabled column indicates whether poison message handling is enabled or disabled.
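For instance (the queue name below is a placeholder), poison message handling can be switched off on an existing queue and its status inspected like this:

```sql
-- Disable poison message handling on an existing queue
ALTER QUEUE [dbo].[manual_activation_target_queue]
    WITH POISON_MESSAGE_HANDLING (STATUS = OFF);
GO

-- Check the poison message handling status of every queue in the database
SELECT name, is_poison_message_handling_enabled
FROM sys.service_queues;
```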
I recently had the privilege of learning Advanced T-SQL with Itzik Ben-Gan (Blog|Twitter), a mentor and one of the founders of SolidQ. This 2-day Advanced T-SQL course spanned a range of interesting topics that a database developer would find extremely valuable on a day-to-day basis! Not only does Itzik deliver a remarkable lecture, he also encourages questions and discussions, which adds a great deal of value to the learning experience. Some of the topics we covered in detail are;
- SQL Server Internals & Index Tuning – This was my very first deep dive into SQL Server internals pertinent to indexes. The multitude of index access methods the optimizer can choose from, and their impact on query performance, has opened a whole new world of indexing strategies for me. Filtered indexes and columnstore indexes are some of the newer features I am looking forward to trying out.
- Temporary Tables – Never before has the distinction between temporary tables, table variables and Common Table Expressions (CTEs) been presented with so much clarity. I now have a solid understanding of which to use in any given scenario.
- Sets vs Cursors – Again, a classic topic. Never hurts to reinforce the advantages of a Set based solution versus cursors.
- Query Tuning has never been so interesting with a real life example of dramatic performance gains between various revisions of the same query.
- Apply – I cannot believe I had used the APPLY operator only for XML queries so far. I have now learned its uses for unpivoting, aggregating and reusing column aliases.
- Grouping Sets – I had learned this one while studying for my MCTS certification, but had never got around to making much use of it. This gave me the chance to understand its full potential and applicability.
- Sequences – I had done some research on this topic for my blog post about the same, but I soon found out there is still so much more to it, especially when comparing it with identity.
- Window Functions – The enhancements to window functions, and the world of partitioning, ordering and framing, offer a whole new paradigm for OLAP queries used in reporting.
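As a small illustration of the framing concept covered in the last topic above (the table and column names are made up for this sketch), a per-customer running total can be expressed with the ROWS frame introduced in SQL Server 2012:

```sql
-- Running total of order amounts per customer, using a window frame
SELECT customer_id,
       order_date,
       amount,
       SUM(amount) OVER (PARTITION BY customer_id
                         ORDER BY order_date
                         ROWS BETWEEN UNBOUNDED PRECEDING
                                  AND CURRENT ROW) AS running_total
FROM dbo.orders;
```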
Over the next few weeks, I intend to take the time to absorb all the new concepts and ideas I learned in this 2 day course, and find areas to implement these ideas.
SQL Saturday #118 | Madison, WI on Apr 21st 2012, definitely ranks amongst the most rocking SQL Saturdays I have attended! Jes Borland [blog|twitter] and team produced a top-tier event for both speakers and attendees alike. This being their first SQL Saturday, I feel very fortunate to have been chosen as a speaker and to present my session on Service Broker. Taking a 6-hour road trip on Friday evening, I rolled into Madison with the St.Louis contingent (Kim Tessereau [blog|twitter], Mike Lynn [twitter] & Jay Carter). It was too late to join the speakers' dinner, but we did have an awesome dinner at a hibachi grill!
On the morning of Apr 21st, we found our way to the Madison Area Technical College and were pleasantly welcomed at the registration desk – with no lines! The SpeedPASS worked out really well for the event, and they also had a printer at the registration desk to help attendees print one. There were plenty of volunteers helping attendees find their way around the venue, and the breakfast was great. All of this made a great first impression – very well done!
Bill Fellows [blog|twitter] kicked off the event with an awesome session on TSQL sweets in SQL Server 2012, followed by an equally good session on new SSIS 2012 features by Norman Kelm [blog|twitter]. Next up was my interview with Karla Landrum [blog|twitter] at the PASS booth, for the upcoming SQL Saturday event in St.Louis. It worked out for the best to have a face-to-face discussion with Karla, because we ended up talking for over an hour and I learned so much more about the makings of a successful event. We have moved one step closer to hosting the very first SQL Saturday in St.Louis, in the Fall of 2012!
I hosted the Service Broker "Cows of a Spot" lunch table and it turned out to be an exciting opportunity to network with fellow Service Broker enthusiasts. The delicious baked beans and burgers, as well as the efficient crowd management skills of the volunteers, deserve a special mention. The next session on my list was about filegroups. Jes made an impressive presentation (LEGOs were involved!) about how filegroups help with the performance and management of your database. Thanks to her presentation, I have learned the skill of piecemeal restore of selected filegroups.
I took a short break from SQL learning in the Speakers' Lounge to brush up on my presentation. I then picked up my shiny new speaker's shirt and settled down to review my presentation material. After dotting the i's and crossing the t's, I headed to my session (the last session of the day). I was pleasantly surprised to see a few familiar faces in the audience. This was only the second time I had presented my Service Broker session at a SQL Saturday, and I really appreciate all the support and positive feedback I got from the audience! One of the very useful pieces of information I picked up from the Q&A was the idea of using Extended Events to troubleshoot Service Broker applications, since Microsoft has announced the deprecation of SQL Server Profiler in a future version.
The closing ceremonies followed and I am really happy for all the winners of the raffle prizes from the event’s generous sponsors. After a big round of applause for the speakers, volunteers/organizer and the sponsors, the St.Louis contingent hit the road for the long ride back home.