Blog Archives

I used PowerShell!

Our production environment consists of SQL Server 2008 R2 with several databases across multiple SQL Server instances. We follow a somewhat old-school approach to deployment, wherein once a project is past QA and in the Stage/UAT environment, we no longer create and deploy builds in a cumulative fashion. When bugs are found in the Stage/UAT environment, the builds that fix those bugs (an iterative cycle) are preserved and deployed sequentially, as-is, in Production as well. If we needed 10 iterations (hence 10 builds) to fix a bug in Stage/UAT, we will deploy the same 10 builds to Production sequentially!

The Problem:

This tediously meticulous approach to deployment guarantees (in theory) the repetition in Production of the same successful path to deployment that was taken in the Stage/UAT environment. It leads to the same quality of code being deployed to Production as was deployed to Stage/UAT, and is hence expected to produce the same results (in theory). However, the number of iterations needed to fix all bugs in Stage/UAT is often large enough that we routinely end up with builds running into double digits. Efficiently and accurately deploying 10-plus builds to Production within a relatively short deployment window was starting to become a challenge for us (our DBA is not only expected to log deployment results, but to proceed with the next script ONLY upon success of the previous script). While we were not ready to fully automate the execution of our deployment scripts via a batch run, we needed a command-line method for deploying our SQL scripts relatively fast, where the execution messages are not only captured in a log file, but also displayed on the screen. This would not only let our DBA identify whether a script's execution encountered any errors without having to open up the log file, but also help execute the deployment faster than a fully manual, SSMS-based approach.

The Solution:

Our first attempt was using SQLCMD to achieve a fair degree of automation and speed up the deployment by reducing manual work. I have a simple test script here with a few PRINT statements, one simple SELECT statement that executes successfully, and another simple SELECT statement that fails due to a non-existent table (to simulate a script failure scenario). Do take note that my script uses the SQLCMD command ":on error exit", which causes the batch to stop execution upon encountering an error. I have named the script quite creatively as "test.sql".

USE Demo;
GO
:on error exit
PRINT N'Deploying Demo Script...';
GO
SELECT COUNT(*) FROM [dbo].[demo_order];
GO
PRINT N'Running query against non-existing table...';
GO
SELECT COUNT(*) FROM [dbo].[does_not_exist];
GO
PRINT N'This PRINT should not run as previous query errors and batch should exit...';
GO

When run in SSMS (in SQLCMD mode), this script produces the following output, and exits the batch upon encountering the first error, as expected:

Deploying Demo Script...
(1 row(s) affected)
Running query against non-existing table...
Msg 208, Level 16, State 1, Line 2
Invalid object name 'dbo.does_not_exist'.
** An error was encountered during execution of batch. Exiting.

The quickest way to automate the execution of my test script is to use SQLCMD from the command line. Note the "-b" option used in my SQLCMD command string, which forces termination of the batch upon encountering errors. This is functionally similar to using the ":on error exit" SQLCMD command within the script itself. Here is the simple command-line string:

sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql -o test.sql.log.txt

When this SQLCMD command string is executed at the command prompt, it creates the log file documenting the error message and the fact that the batch was terminated. However, note that the command prompt screen shows no indication of the script's success or failure.

[Screenshot: the SQLCMD command running at the command prompt – no execution messages are displayed]

Unless our DBA opens up the log file "test.sql.log.txt" for review, he cannot see the execution and error messages shown below. (I could add a "type" command on the next line here, but we prefer to have a single-line command.)

Changed database context to 'Demo'.
Deploying Demo Script...
-----------
 12
(1 rows affected)
Running query against non-existing table...
Msg 208, Level 16, State 1, Server WKS18176\SANIL_2012, Line 2
Invalid object name 'dbo.does_not_exist'.

This is where PowerShell came to our rescue. With a minor modification to the SQLCMD command itself, and the addition of a PowerShell cmdlet, we were able to not only log the execution messages to a file, but also display them on the PowerShell screen, without losing any functionality related to exiting the batch upon error.

sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql | Tee-Object -FilePath test.sql.log.txt

Here is a screenshot of executing my test script via PowerShell.

[Screenshot: the sqlcmd and Tee-Object pipeline running in PowerShell – execution messages displayed on screen]
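While we still run our builds one at a time, it is easy to see how this approach could be taken a step further. Here is a rough PowerShell sketch (the build script names are made up for illustration) that runs a list of scripts in order and stops at the first failure, since the "-b" option makes sqlcmd return a non-zero exit code when a batch fails:

$scripts = @('build01.sql', 'build02.sql', 'build03.sql')   # hypothetical build scripts

foreach ($script in $scripts) {
    # Log to a file and echo to the screen, just like the single-script command above
    sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i $script | Tee-Object -FilePath "$script.log.txt"

    # Stop the deployment right here if the script failed
    if ($LASTEXITCODE -ne 0) {
        Write-Host "Deployment halted: $script failed. Review $script.log.txt" -ForegroundColor Red
        break
    }
}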

This was my first time using PowerShell, and I am impressed by how quickly we were able to learn and use it. Over the next few weeks, I am going to take up exploring PowerShell and learn how I can apply it to ease some more of our automation pain points!


Studying for MCSA: SQL Server 2012 – A 90 Day Journey



With only 9 weeks left until the end of the year, I figured I should start getting serious about meeting some of my goals for 2013. I did successfully complete my first goal for 2013 – speaking at the PASS Summit last week. My second goal for the year is to achieve the MCSA: SQL Server 2012 certification. Since I am Microsoft Certified (MCTS) on SQL Server 2008, I will be taking the two exams to transition to MCSA: SQL Server 2012 – the 70-457 and the 70-458 – instead of the usual 3 exams. However, in the absence of any training materials specifically geared towards these transition exams, I am using the standard training kit from Microsoft Press to augment my real-world experience on SQL Server 2012. The Microsoft Press training kit consists of 3 books, 1 each for exams 70-461, 70-462 and 70-463. These training kits are available on Amazon. I also happen to have a subscription to Pluralsight's training library, which has over 8 hours of training videos for exam 70-461 by Christopher Harrison and over 16 hours of training videos for exam 70-462 by Sean McCown.

I am hoping that the combination of these two learning resources will help me prepare for the MCSA: SQL Server 2012 exams in about 90 days (just over 12 weeks). My training schedule is inspired by Microsoft's "90 Days to MCSA: SQL Server 2012 Edition" program. I intend to continue writing a weekly blog post that documents my learning and progress towards achieving the MCSA: SQL Server 2012 certification. Hope to hear a lot of feedback and support from #sqlfamily on this journey!


I am speaking at the PASS Summit 2013!


I am honored and excited to be selected to speak at the PASS Summit 2013 in Charlotte, NC – Oct 15th through 18th! I will be talking about "Database Unit Testing" with Visual Studio. This session highlights the importance of unit testing in the development life cycle of a database application. Unit testing a database application is definitely a lot more challenging than unit testing a VB.NET or C# application. Creating a consistent database test environment involves not only database code, but also the data itself. More often than not, due to the time and effort involved in creating a consistent database test environment, unit testing database code is rarely given a thought upfront during development. This usually leads to late discovery of bugs, which are more expensive to fix as the development life cycle progresses. Visual Studio, with Database projects and more recently with SQL Server Data Tools (SSDT), has made unit testing fairly easy to implement. During the course of this session, we will touch on the concepts of unit testing and demonstrate the implementation of unit tests for a Database project and an SSDT project in VSTS 2010 and VSTS 2012 respectively. If you have already implemented database unit test projects in VSTS 2010, we will also go through a demo of upgrading them to SSDT.

I have presented this session at several SQL Saturday events, user group meetings and regional conferences, and I am looking forward to bringing this session to the PASS Summit. I look forward to seeing you all at the Summit in October!

Programming with SQL Server: Variables

Understanding the various types of variables SQL Server has to offer, and their appropriate usage, is one of the cornerstones of developing effective database code. I recently helped a co-worker fix an error message in his code.

-- Error
Msg 137, Level 15, State 2, Line 3
Must declare the scalar variable "@l_INT".

The code snippet looked something like this, which led to a discussion about the scope of T-SQL variables and an interesting way to fix this problem.

CREATE TABLE #scope_test1
 (col1 INT);

CREATE TABLE #scope_test2
 (col1 INT);
GO

DECLARE @l_INT INT = 42;

INSERT INTO #scope_test1 (col1)
 SELECT @l_INT;
GO

-- @l_INT went out of scope at the GO above, so this INSERT fails with Msg 137
INSERT INTO #scope_test2 (col1)
 SELECT @l_INT;
GO

The T-SQL language supports local variables (their names begin with a single @). The scope of a variable is the range of Transact-SQL statements that can reference it; it lasts from the point the variable is declared until the end of the batch or stored procedure in which it is declared. A T-SQL batch is a group of statements that SQL Server parses and executes as a single unit; a client tool like SQL Server Management Studio (SSMS) marks the end of a batch with the GO command (you can set any word to be the batch separator in SSMS, but we will leave that discussion for another time).

The names of some Transact-SQL system functions begin with two at signs (@@), and they are commonly referred to as global variables. You do not need to declare them, since the server constantly maintains them. They are system-defined functions, not variables, and do not have the same behaviors as variables; they represent information specific to the server or the current user session. Some of the commonly used ones are @@ERROR, @@IDENTITY and @@VERSION. Traditionally, the DECLARE command is used to declare a local variable, and SET or SELECT is used to initialize its value. Since SQL Server 2008, both of these tasks can be accomplished in a single statement:

DECLARE @MyCounter INT = 12;
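For contrast, the system functions mentioned above need no DECLARE at all, since the server maintains them. A quick, contrived illustration:

SELECT @@VERSION AS [server_version];
SELECT @@ERROR AS [last_error_number]; -- returns 0 here, since the previous statement succeeded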

The variables we discussed so far are scalar variables, which hold a single data value of a specific type. A table variable is a special data type that can be used to store a result set for processing at a later time; it is primarily used for temporary storage of a set of rows returned as the result set of a table-valued function. Functions and variables can be declared to be of type table, and table variables can be used in functions, stored procedures, and batches. A DECLARE statement (similar to local scalar variables) is used to declare a table variable. While table variables behave exactly like local variables, with a well-defined scope, they can be thought of as being similar to temporary tables, but with several limitations.
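A table variable is declared and scoped just like a scalar variable. Here is a quick, contrived sketch (the variable name is made up for this demo):

DECLARE @order_summary TABLE
 (col1 INT NOT NULL);

INSERT INTO @order_summary (col1)
 VALUES (42);

SELECT col1 FROM @order_summary;
GO
-- @order_summary is out of scope past the GO, just like a scalar variable would be.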

The SQLCMD utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches. Scripting variables can be used in SQLCMD scripts. Scripting variables enable one script to be used in multiple scenarios. The setvar command is used to define scripting variables. Variables that are defined by using the setvar command are stored internally. Scripting variables should not be confused with environment variables that are defined at the command prompt by using SET.

The scope of a SQLCMD scripting variable can span several batches, which can be used to implement variables that don't go out of scope even when a batch ends. The example below is a simplified demo of the same:

:setvar l_int "42"
CREATE TABLE #scope_test1
 (col1 INT);
CREATE TABLE #scope_test2
 (col1 INT);
 GO
INSERT INTO #scope_test1 (col1)
 SELECT $(l_int);
 GO
INSERT INTO #scope_test2 (col1)
 SELECT $(l_int);
 GO

-- Results
(1 row(s) affected)
(1 row(s) affected)
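The same technique also works without hard-coding the value in the script: sqlcmd's -v option assigns scripting variables from the command line, which is what makes one script reusable across scenarios. A hypothetical example (assuming the script above is saved as scope_fix.sql, with the :setvar line removed):

sqlcmd -S WKS18176\SANIL_2012 -d Demo -i scope_fix.sql -v l_int="42"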

This review of the types of variables the T-SQL language has to offer has helped refresh my understanding of their appropriate usage.


Recap of SQL Saturday #145 Nashville

SQL Saturday #145 was my first trip to Nashville, and I must say I am impressed with the city. The recommended event hotel was located in a very nice part of town, a short drive from Radnor Lake State Park, and the weather on Friday evening was just perfect for a walk in the park.

The event venue, David Lipscomb University, was also a short drive from the recommended hotel and has a very scenic campus. There were plenty of signs, and the location was fairly easy to find. The registration desk was well managed and the breakfast was excellent. I picked up my speaker shirt and went looking for my friends!

I met with Kathi Kellenberger (b|t), Malathi Mahadevan (b|t), Kevin Kline (b|t), David Klee (b|t) and Rick Morelan (b|t) on the morning of the event. I was particularly excited about my 11:00 AM session on Parameter Sniffing, as I had recently updated all of my demos to run on SQL Server 2012. My session was very well received, and I got a great set of attendees who were very much interested in learning about parameter sniffing. The session slide deck and demo scripts are available for download on the SQL Saturday #145 website.

After my session, I spent a good deal of time talking to attendees, other speakers, organizers and sponsors. The organizers did a great job of putting up a spectacular SQL Saturday, and I would like to thank them for the opportunity to speak at this event. This was my last SQL Saturday for the year, and I look forward to more SQL Saturdays in 2013!

Kathi, Rick & Sanil at Joes2Pros booth – SQL Saturday #145, Nashville

Upcoming Speaking Engagements

I have two speaking engagements coming up this week that I am really excited about. I have my very first webinar coming up on 10/11/2012. Pragmatic Works is giving me an opportunity to present my session on Parameter Sniffing in their Free Training on the T's webinar series. You can click here to register for this webinar, scheduled for 11:00 AM EST on 10/11. Though I have presented this session at several SQL Saturdays in the past, I am looking forward to my very first experience delivering this session through a webinar!

My second speaking engagement for this week is at SQL Saturday #145 in Nashville, on 10/13/2012. I will be talking about Parameter Sniffing at 11:00 AM in Room 2. You can click here for the full event schedule, and if you haven't already signed up for this event, you can register here. I have recently updated my demos for this session to run on SQL Server 2012, so I am really looking forward to presenting them. SQL Saturday #145 not only has an exciting session lineup on the event day, but also 4 fantastic pre-cons on Friday 10/12. This will also be my first trip to Nashville, the "Music City", and I plan on visiting the Country Music Hall of Fame!

SQL Saturday #122 | Louisville, KY

SQL Saturday #122 | Louisville, KY on July 21, 2012 was the 5th SQL Saturday I have attended so far, and my 3rd as a speaker. The St.Louis contingent – Kathi Kellenberger, Kim Tessereau, Mike Lynn, Jay Carter, Cindy Baker and me! – was especially excited to attend this event, not only because it's organized by our friend Malathi Mahadevan, but also for a chance to escape the St.Louis heat!

We had almost forgotten about the time zone change when we drove into Louisville at 6 PM Central sharp, only to realize we were an hour late for the speakers’ dinner! The Bristol Bar & Grille was the perfect location for a great speakers’ dinner, and gave us all a chance to relax, network and enjoy some good food (My personal favorite was the Espresso Crème Brûlée).

The University of Louisville is a short 10-minute drive from the Marriott Hotel, and thanks to the email notifications with directions, as well as plenty of signs, we had no trouble finding the venue. Thanks to SPEEDPASS, there were no lines at the registration desk, and I found they had my favourite Asiago Cheese Bagels for breakfast! My first session of the day was Andy Thiru's "SQL Azure Intro and What's New" session, and it surely exceeded my expectations. I have never had the opportunity to work with SQL Azure so far, and this session gave me the knowledge and tools to get started on my own. The next session on my list was "What Sequence objects are (and are not)" by Louis Davidson. I was an Oracle DBA until a few years ago and took sequences for granted, until I discovered SQL Server didn't have them (until 2012). With their introduction in SQL Server 2012, I took this opportunity to get myself reacquainted with sequences.

I had some delicious Veggie Wraps and a Cookie for lunch – again, no lines and no waiting! Post lunch, I took a break in the Speakers’ Lounge to review my upcoming session on Parameter Sniffing, where I discovered a cooler full of Ice Cream!  I had to stop myself after two servings and got back to reviewing my slides & checking my demos. A majority of the attendees for my session were quite involved with the topic, giving rise to several discussions and Q&A, thus making my session all the more valuable for everyone in the room. I was really pleased with the generous evaluations and great feedback for my session.

The last session of the day for me was "Bulletproof: Hardening your SQL Server from Attack" by Sarah Barela. As a developer, I take care of hardening my code against SQL injection, but usually let administrators worry about securing the servers and databases. This session revealed the amount of work administrators (database, server, as well as network) put in to secure our servers! After the last session, it was time for the closing ceremonies and raffle. The SQL Saturday #122 team hosted a great event with a full day of valuable SQL learning. I am really thankful to the SQL Saturday #122 team for giving me the opportunity to present my session, and to all the sponsors whose support makes such events possible.

I am looking forward to seeing my friends from Louisville again at SQL Saturday #154 in St.Louis on Sept 15th, the very first SQL Saturday in St.Louis!

Database Unit Testing Made Easy with Visual Studio Team Systems

While brushing up on my knowledge of software testing concepts, I came across quite an amusing definition of testing: "To tell somebody that he is wrong is called criticism. To do so officially is called testing." A programmer usually resents it when a tester finds a defect in his code. We programmers thoroughly unit test our code before handing it off to a tester, because we take pride in developing a bug-free application. Some programming languages (C#, VB, ASP.NET) lend themselves to unit testing easily, because the application is developed within Visual Studio and can readily leverage its unit testing framework.

Visual Studio allows you to create Database projects, and database developers have been embracing them since Visual Studio Team System 2008 Database Edition GDR. This offers a robust framework for database developers to identify bugs in their database objects (schemas, stored procedures, functions, etc.) by unit testing their database (T-SQL) code before handing it over to the tester. Before we jump into the specifics of database unit testing with Visual Studio, the next couple of paragraphs warm us up to the topic by covering a few basic concepts of software testing.

Software testing undoubtedly plays an important role in the life cycle of most IT projects. The goal of any type of software testing is to identify defects to be fixed, so that the product meets requirements and has a deterministic, predictable output. Depending on the testing method employed, testing can be implemented at any time in the development process. Different software development models focus the test effort at different phases of the development process. Newer development models, such as Agile, often employ test-driven development and place an increased portion of the testing in the hands of the developer, before it reaches a formal team of testers.

Software testing methods are traditionally divided into white-box and black-box testing. These two approaches describe the point of view a test engineer takes when designing test cases. Unit testing falls under the category of white-box testing, where the tester has access to the internal data structures and algorithms, including the code that implements them. This is in contrast with the black-box testing method, which treats the software as a "black box" – without any knowledge of the internal implementation. A black-box tester is usually not a programmer, and aims only to test the functionality of the software according to the applicable requirements. Since the black-box tester has no knowledge of the underlying code, he may find bugs that a programmer misses. However, the same principle can sometimes lead to writing inefficient or incomplete test cases.

Unit testing is a key component of test-driven development (TDD). Unit tests are usually written by developers while they work on the code, to ensure that a specific piece of code (function, class, stored procedure, etc.) is working as expected. Unit testing helps to identify defects in the earlier stages of the software development life cycle, where they are cheaper to fix. Unit testing can prove especially challenging in the world of database development, because of the need for a consistent test environment.

Database unit tests are used to establish a baseline state for a database and then to verify any subsequent changes that you make to database objects. The Unit Testing Framework in Visual Studio (starting with VSTS 2005) helps database developers create, manage and execute unit tests for a database. The Microsoft.VisualStudio.TestTools.UnitTesting namespace supplies classes that provide unit testing support. This namespace contains many attributes that identify test information to the test engine regarding data sources, order of method execution, program management, agent/host information, and deployment data. It also contains custom unit testing exceptions.

You will need the Database Edition GDR of VSTS 2008, or the Ultimate (or Premium) editions of VSTS 2010, to create, modify and run database unit tests. You can run database unit tests with the Professional edition of VSTS 2010, but cannot create or modify them. Before you can start running database unit tests in VSTS, you must first create a Database project and then create a test project. The next step is to write sets of Transact-SQL tests that exercise your database objects. Executing these tests in your isolated development environment helps you verify whether those objects are behaving correctly before you check them in to version control. As changes are made to the database schema, you can use these tests to verify whether the changes have broken existing functionality. A detailed step-by-step walkthrough for creating and running database unit tests can be found here on MSDN. Once created, the unit test project and its tests will show up in the Solution Explorer view.

In a typical database unit test, a Transact-SQL test script runs and returns an instance of the ExecutionResult class. The instance of this class contains a DataSet, the execution time, and the rows affected by the script, all collected during execution of the script. These results can be evaluated within the Transact-SQL script itself by using the RAISERROR function (a minimal sketch follows the list below), or they can be evaluated by using test conditions. Visual Studio Premium provides the following set of predefined test conditions for you to use:

  • Data Checksum
  • Empty Resultset
  • Execution Time
  • Expected Schema
  • Inconclusive
  • Not Empty Resultset
  • Row Count
  • Scalar Value
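As a minimal sketch of such an in-script assertion via RAISERROR (the table name and expected count are assumptions for illustration):

DECLARE @actual INT;
SELECT @actual = COUNT(*) FROM [dbo].[demo_order];
IF (@actual <> 12)
 RAISERROR (N'Test failed: expected 12 rows in demo_order, found %d.', 16, 1, @actual);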
You can also create negative unit tests to verify expected failures in a stored procedure. A detailed description of the database unit test conditions can be found here on MSDN. Your T-SQL test scripts, and the number of test conditions you can use, are limited only by your time and test scope constraints!
Visual Studio lets you create test lists to organize unit tests into groups. Test lists are also used to:
  • Run multiple unit tests as a group
  • Run tests as a part of a Build
  • Enforce check-in policy
Two of the most popular ways of organizing a test list are by the level of testing and by functionality.
Visual Studio 11 (currently in Beta) comes with a host of unit testing enhancements. Some of the new features include the Unit Test Explorer and support for third-party test frameworks. More details can be found here on Peter Provost's MSDN blog post.
Visual Studio not only continues to offer a robust framework for implementing database projects, but also makes a compelling case for using it to unit test database code. It helps us programmers get one step closer to developing a bug-free application. Usually, the end result of this undertaking is a rewarding "Test Results" screen with all indicators in green. Visual Studio also allows you to export and save the test results for future reference.
I am presenting a 60-minute session on this topic at the St.Louis Days of Dot Net Conference, Aug 2nd – 4th 2012, where I will not only cover the basics of unit testing concepts and terminology, but also discuss how unit testing helps ensure and document the quality and accuracy of database deliverables. I will also run through a demo of creating and running database unit tests using VSTS 2010. I have spoken on the same topic in the past at the St.Louis SQL Server User Group meeting on June 12th 2012.

SQL Server 2012 – What’s new in Service Broker?

Service Broker has come a long way since its introduction in Microsoft SQL Server 2005. SQL Server 2008 brought a host of Service Broker enhancements, like conversation priorities, the ssbdiagnose utility, alternate poison message handling, and new performance objects & counters. SQL Server 2012 is a major release that comes with an astonishing array of enhancements and new features, including new bells & whistles for Service Broker.

A significant new feature for Service Broker is the ability to multicast messages. This enables a single initiator service to send messages to multiple target services, akin to SQL Server Replication, where multiple subscribers can subscribe to the same publication. The syntax of the SEND command has been extended to enable multicasting by allowing multiple conversation handles.

DECLARE @cvh1 UNIQUEIDENTIFIER,
 @cvh2 UNIQUEIDENTIFIER,
 @cvh3 UNIQUEIDENTIFIER,
 @TestMsgBody XML ;

SET @TestMsgBody = '<test>Test Message</test>' ;

BEGIN DIALOG @cvh1
FROM SERVICE [//InitiatorService]
TO SERVICE '//TargetService1'
ON CONTRACT [//TestProcessing] ;

BEGIN DIALOG @cvh2
FROM SERVICE [//InitiatorService]
TO SERVICE '//TargetService2'
ON CONTRACT [//TestProcessing] ;

BEGIN DIALOG @cvh3
FROM SERVICE [//InitiatorService]
TO SERVICE '//TargetService3'
ON CONTRACT [//TestProcessing] ;

SEND ON CONVERSATION (@cvh1, @cvh2, @cvh3)
 MESSAGE TYPE [//TestMgsType]
 (@TestMsgBody) ;

Another new feature is the addition of a new column – message_enqueue_time – to queues. This column helps us determine the time spent by a message on the queue. The message_enqueue_time column is of the datetimeoffset data type and stores the date and time (with time zone awareness) when the message arrives on the queue. It's exposed to the application via a direct query on the queue itself, as well as a column in the RECEIVE statement. Let's look at an example:

-- Begin dialog & send message
DECLARE @ch UNIQUEIDENTIFIER;

BEGIN DIALOG CONVERSATION @ch
 FROM SERVICE [manual_activation_initiator_service]
 TO SERVICE 'manual_activation_target_service'
 ON CONTRACT [manual_activation_contract]
 WITH ENCRYPTION = OFF ;

SEND ON CONVERSATION @ch
 MESSAGE TYPE [manual_activation_demo_request]
 ('<test>test_message_body</test>');
GO

-- A direct SELECT statement from the Queue
SELECT
 conversation_handle
 ,CAST ( message_body AS XML) AS [msg_body]
 ,DATEDIFF(second,message_enqueue_time,getutcdate()) AS [time_in_queue(Seconds)]
FROM [manual_activation_target_queue];

-- Results
conversation_handle                  msg_body                        time_in_queue(Seconds)
------------------------------------ ------------------------------- ----------------------
C0CB3200-809F-E111-8B29-005056BE0088 <test>test_message_body</test>  21

(1 row(s) affected)

-- Retrieve the message off of the Queue with a Receive
WAITFOR (
 RECEIVE TOP(1)
 conversation_handle
 ,CAST(message_body AS XML) AS [msg_body]
 ,DATEDIFF(second,message_enqueue_time,getutcdate()) AS [time_in_queue(Seconds)]
FROM manual_activation_target_queue),
TIMEOUT 5;

-- Results
conversation_handle                  msg_body                        time_in_queue(Seconds)
------------------------------------ ------------------------------- ----------------------
C0CB3200-809F-E111-8B29-005056BE0088 <test>test_message_body</test>  21

(1 row(s) affected)

Please note that this column is not well documented yet, and to derive the time spent by a message on the queue in seconds, one must use a few date & time functions.

Service Broker remote messaging needs some additional configuration in order to get along with Always On Availability Groups, due to the complexity added by the listener for an Availability Group. The Always On Availability Groups feature, introduced in SQL Server 2012, is a high availability and disaster recovery solution that provides an enterprise-level alternative to database mirroring.

For a service in an Availability Group to be able to receive remote messages:

  • The Availability Group must have a listener configured
  • Service Broker Endpoint must be created and configured with the Listener, for every instance of SQL Server in the Availability Group
  • Connect permissions must be granted on the endpoints to the appropriate login(s)
  • msdb must contain a route

For a service in an Availability Group to be able to send messages to a remote service (a combined sketch follows the list below):

  • Configure a route to the target service using the listener
  • msdb must contain a route
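Here is a rough T-SQL sketch covering both checklists. This is not a complete or production-ready configuration; the endpoint, login and route names, the port, the target service name and the listener name are all assumptions for illustration:

-- Run on every instance that hosts a replica of the Availability Group:
CREATE ENDPOINT [ssb_endpoint]
 STATE = STARTED
 AS TCP (LISTENER_PORT = 4022)
 FOR SERVICE_BROKER (AUTHENTICATION = WINDOWS);

GRANT CONNECT ON ENDPOINT::[ssb_endpoint] TO [demo_broker_login];

-- On the sending side, point the route at the Availability Group listener,
-- so messages follow the primary replica after a failover:
CREATE ROUTE [route_to_target]
 WITH SERVICE_NAME = '//TargetService1',
 ADDRESS = 'TCP://ag_listener_name:4022';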

And last but not least, it's always worth mentioning that since SQL Server 2008 R2, poison message handling can be disabled at the time of queue creation, or at a later time by using the ALTER QUEUE statement. You can also check the status of poison message handling for each queue by querying the sys.service_queues catalog view, which has a column, is_poison_message_handling_enabled, that indicates whether poison message handling is enabled or disabled.
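For example, poison message handling could be turned off for an existing queue, and its current status verified, like this (using the queue from the earlier example):

ALTER QUEUE [manual_activation_target_queue]
 WITH POISON_MESSAGE_HANDLING (STATUS = OFF);

SELECT name, is_poison_message_handling_enabled
FROM sys.service_queues
WHERE name = N'manual_activation_target_queue';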

Advanced T-SQL with Itzik Ben-Gan

I recently had the privilege of learning Advanced T-SQL with Itzik Ben-Gan (Blog|Twitter), a mentor and one of the founders of SolidQ. This 2-day Advanced T-SQL course spanned a range of interesting topics that a database developer would find extremely valuable on a day-to-day basis! Not only does Itzik deliver a remarkable lecture, he also encourages questions and discussions, which adds a great deal of value to the learning experience. Some of the topics we covered in detail are:

  • SQL Server Internals & Index Tuning – This was my very first deep dive into the SQL Server internals pertinent to indexes. The multitude of index access methods that the optimizer can choose from, and their impact on query performance, has opened a whole new world of indexing strategies for me. Filtered indexes and columnstore indexes are some of the newer features I am looking forward to trying out.
  • Temporary Tables – Never before has the distinction between temporary tables, table variables and common table expressions (CTEs) been presented with so much clarity. I now have a solid understanding of which to use in any given scenario.
  • Sets vs Cursors – Again, a classic topic. Never hurts to reinforce the advantages of a Set based solution versus cursors.
  • Query Tuning – Never has query tuning been so interesting, with a real-life example of dramatic performance gains between various revisions of the same query.
  • Apply – I cannot believe I had used the APPLY operator only for XML queries so far. I have now learned its uses for unpivoting, aggregating and reusing column aliases.
  • Grouping Sets – I had learned this one while studying for my MCTS certification, but had never got around to making much use of it. This gave me the chance to understand its full potential and applicability.
  • Sequences – I had done some research on this topic for my earlier blog post about the same, but I soon found out that there is still so much more to it, especially when comparing it with Identity.
  • Window Functions – The enhancements to window functions, and the world of partitioning, ordering and framing, offer a whole new paradigm for OLAP queries used in reporting.

Over the next few weeks, I intend to take the time to absorb all the new concepts and ideas I learned in this 2 day course, and find areas to implement these ideas.
