SQL Saturday #154 | St. Louis, on Sept 15th 2012, was the first SQL Saturday event in St. Louis. I am happy to say that we started off the SQL Saturday series in St. Louis (yes, there will be one next year as well!) on a great note. We had over 130 attendees, 11 onsite sponsors and 25 technical sessions spread across 5 classrooms.
The Center for Application of Information Technology (CAIT) at WashU generously offered the venue (6 awesome classrooms plus the common areas and facilities) to host this event. The support of our sponsors played a huge role in making this event successful. We had local as well as national sponsors, plus plenty of SWAG sponsors as well. Within a month of opening our call for speakers, we had over 55 sessions submitted for this event. They were a great mix of new and experienced speakers, both local and out of town, on topics ranging across Database Administration, Application Development, Business Intelligence and Professional Development. The real challenge was selecting only 25 sessions, and the selection committee put a lot of thought into coming up with the session schedule for the event.
Our team took some time to review the event, which will help us make the next St. Louis SQL Saturday even better. Feedback from attendees, sponsors, speakers and volunteers was crucial for the team to pinpoint areas of improvement.
- The duration of the overall event should be constrained from 9 AM to 4 PM, by either cutting down the session duration from 75 to 60 minutes or possibly having 4 sessions per classroom. We are trying to secure more classrooms at the venue to make this feasible, so that the total number of session offerings during the event remains the same.
- We heard great feedback about the location and quality of the recommended event hotel, and we plan to continue with the same hotel next year
- We will definitely invest in more and bigger signs on the sidewalks around the venue, to make it easier for attendees to find their way to the parking area and the registration desk, and to locate the facilities in the venue.
- We are also looking at offering a better packaged box lunch and making some healthy meal options available for next year, not only for lunch, but also for breakfast, snacks and beverages.
- The SPEEDPASS worked out very well and we were able to move attendees efficiently through the registration process. We will probably do an online ONLY version of the event schedule for next year, to mitigate any confusion arising out of last minute schedule changes.
- We plan to leverage Twitter and LinkedIn for communicating the majority of updates to event attendees, and use email ONLY for crucial milestones during the days leading up to the event
Our team is always happy to hear feedback from the attendees, and we have plenty of new ideas for next year's event. We do have some resource constraints, considering that the event is free to attend and all of the organizers are volunteers. The event sponsorship, availability of speakers, venue, and costs can change from one year to another, and our team will strive to continue providing a valuable learning experience at the event. Organizing a SQL Saturday event represents teamwork at its best. On that note, I would like to thank everyone involved: the core team, all the volunteers, PASS & community mentors, sponsors, speakers, and friends & family for a successful first SQL Saturday in St. Louis.
SQL Saturday #145 was my first trip to Nashville and I must say I am impressed with the city. The recommended event hotel was located in a very nice part of town, a short drive from Radnor Lake State Park, and the weather on Friday evening was just perfect for a walk in the park.
The event venue, David Lipscomb University, was also a short drive from the recommended hotel and has a very scenic campus. There were plenty of signs and the location was fairly easy to find. The registration desk was well managed and the breakfast was excellent. I picked up my speaker shirt and went looking for my friends!
I met with Kathi Kellenberger (b|t), Malathi Mahadevan (b|t), Kevin Kline (b|t), David Klee (b|t) and Rick Morlean (b|t) on the morning of the event. I was particularly excited about my 11:00 AM session on Parameter Sniffing, as I had recently updated all of my demos to run on SQL Server 2012. My session was very well received, and I got a great set of attendees who were very interested in learning about parameter sniffing. The session slide deck and demo scripts are available for download on the SQL Saturday #145 website.
After my session, I spent a good deal of time talking to attendees, other speakers, organizers and sponsors. The organizers did a great job of putting up a spectacular SQL Saturday, and I would like to thank them for the opportunity to speak at this event. This was my last SQL Saturday for the year and I look forward to more SQL Saturdays in 2013!
I have two speaking engagements coming up this week that I am really excited about. I have my very first webinar coming up on 10/11/2012. Pragmatic Works is giving me an opportunity to present my session on Parameter Sniffing as part of their Free Training on the T's webinar series. You can click here to register for this webinar, scheduled at 11:00 AM EST on 10/11. Though I have presented this session at several SQL Saturdays in the past, I am looking forward to my very first experience delivering it as a webinar!
My second speaking engagement for this week is at SQL Saturday #145 in Nashville, on 10/13/2012. I will be talking about Parameter Sniffing at 11:00 AM in Room 2. You can click here for the full event schedule, and if you haven't already signed up for this event, you can register here. I have recently updated my demos for this session to run on SQL Server 2012, so I am really looking forward to presenting them. SQL Saturday #145 not only has an exciting session lineup on the event day, but also 4 fantastic pre-cons on Friday 10/12. This will also be my first trip to Nashville, the "Music City", and I plan on visiting the Country Music Hall of Fame!
SQL Saturday #154, the first SQL Saturday in St. Louis, is almost here (Sept 15) and I wanted to share some details of this exciting upcoming event.
This event will be held on Sept 15th, 2012 at the CAIT campus of Washington University in St. Louis. The street address of the venue is 5 Jackson Avenue, Center for Application of Information Technology (CAIT) at Washington University in St. Louis, Clayton, MO 63105. Please follow this link for directions to the venue. The venue is only a short walk from the recommended hotel (Crowne Plaza Clayton). There is free parking for event attendees in the attached parking garage (Lot #60) on Saturday; you do not need any parking pass or permit. The garage can be accessed via Lee Avenue. From the garage, take the elevators to Level 1 and follow the signs for the SQL Saturday #154 registration desk. The registration desk will be open for attendees from 7:30 AM, and the first 100 attendees to come in with a SPEEDPASS will get a free event t-shirt! There will be coffee and donuts for breakfast (first come, first served!). The first session of the day starts at 8:30 AM.
CAIT has generously offered the use of their facilities to host the event, and we would like to remind all of our attendees to be respectful of their premises. We do want to make sure they will continue supporting us next year. We have 200 registered attendees (unfortunately, there are still 60+ on the waitlist!), 25 awesome sessions, and 16 sponsors (11 onsite)! This is going to be a big event in a tight space, and it would help for all attendees to familiarize themselves with the classroom locations in the building.
We will be bringing in lunch boxes from Amighetti's, a St. Louis favorite on the Hill (the Hill has all the awesome Italian restaurants in town). Your lunch tickets (for those who paid for lunch before midnight on Tuesday 9/11/2012) will be available on your SPEEDPASS. You can take your food and beverages (cans and plastic bottles ONLY) into the classrooms to eat. The Gold sponsors will be presenting 30 minute sessions during lunch in each classroom. Please do stop by their booths to say hi (such events are not possible without their generous support) and drop off your raffle tickets (also on your SPEEDPASS) to enter yourself in drawings for awesome prizes.
There will be plenty of volunteers to help out at the event (look for people wearing blue shirts and wearing lanyards that say “STAFF”). After the last session of the day, we will all assemble in and around room #024 for the closing ceremonies – namely a thank you note and raffle prize drawings ! While we are trying our best to make sure we have taken care of everything, this is our first time organizing an event of this size and we would really appreciate all the help and support we can get to ensure a great event.
The after party will be a small informal gathering of the event attendees at Schlafly Bottleworks, in Maplewood, MO 63143. The SQL Saturday #154 team is only suggesting this venue. Attendees are responsible for their own food and beverages at the after party. SQL Saturday #154 organizers will not be responsible for anything at the after party.
On Friday night (Sept 14) around 9:30 PM, we are planning to have a pre-event party in the form of SQL Karaoke. Mike Talayna's Jukebox Restaurant (1115 Hampton Avenue, St. Louis, MO 63139; (314) 647-7600) starts their karaoke at 9:30 PM. All speakers and attendees are welcome to attend (look for Kathi Kellenberger at the event to find the SQL people). The SQL Saturday #154 team is only suggesting this venue. Attendees are responsible for their own food and beverages at this party, and SQL Saturday #154 organizers will not be responsible for anything at the party.
We strive to ensure all of you will have a great time at SQL Saturday #154. Please do make the best of the awesome free opportunity to learn and network.
SQL Saturday #122 | Louisville, KY on July 21, 2012 was the 5th SQL Saturday I have attended so far, and my 3rd as a speaker. The St. Louis contingent – Kathi Kellenberger, Kim Tessereau, Mike Lynn, Jay Carter, Cindy Baker and me! – was especially excited to attend this event, not only because it's organized by our friend Malathi Mahadevan, but also for the chance to escape the St. Louis heat!
We had almost forgotten about the time zone change when we drove into Louisville at 6 PM Central sharp, only to realize we were an hour late for the speakers’ dinner! The Bristol Bar & Grille was the perfect location for a great speakers’ dinner, and gave us all a chance to relax, network and enjoy some good food (My personal favorite was the Espresso Crème Brûlée).
The University of Louisville is a short 10 minute drive from the Marriott hotel, and thanks to the email notifications with directions, as well as plenty of signs, we had no trouble finding the venue. Thanks to SPEEDPASS, there were no lines at the registration desk, and I found they had my favorite Asiago Cheese Bagels for breakfast! My first session of the day was Andy Thiru's "SQL Azure Intro and What's New", and it surely exceeded my expectations. I had never had the opportunity to work with SQL Azure before, and this session gave me the knowledge and tools to get started on my own. The next session on my list was "What Sequence objects are (and are not)" by Louis Davidson. I was an Oracle DBA until a few years ago and took sequences for granted, until I discovered that SQL Server didn't have them before 2012. With their introduction in SQL Server 2012, I took this opportunity to get reacquainted with sequences.
I had some delicious Veggie Wraps and a Cookie for lunch – again, no lines and no waiting! Post lunch, I took a break in the Speakers’ Lounge to review my upcoming session on Parameter Sniffing, where I discovered a cooler full of Ice Cream! I had to stop myself after two servings and got back to reviewing my slides & checking my demos. A majority of the attendees for my session were quite involved with the topic, giving rise to several discussions and Q&A, thus making my session all the more valuable for everyone in the room. I was really pleased with the generous evaluations and great feedback for my session.
The last session of the day for me was "Bulletproof: Hardening your SQL Server from Attack" by Sarah Barela. As a developer, I take care of hardening my code against SQL injection, but usually let administrators worry about securing the servers and databases. This session revealed the amount of work administrators (database, server, as well as network) put in to secure our servers! After the last session, it was time for the closing ceremonies and raffle. The SQL Saturday #122 team hosted a great event with a full day of valuable SQL learning. I am really thankful to the SQL Saturday #122 team for giving me the opportunity to present my session, and to all the sponsors whose support makes such events possible.
I am looking forward to seeing my friends from Louisville again at SQL Saturday #154 in St. Louis on Sept 15th, the very first SQL Saturday in St. Louis!
While brushing up on my knowledge of software testing concepts, I came across quite an amusing definition of testing: "To tell somebody that he is wrong is called criticism. To do so officially is called testing." A programmer usually resents it when a tester finds a defect in his code. We programmers thoroughly unit test our code before handing it off to a tester, because we take pride in developing a bug-free application. Some programming languages (C#, VB, ASP.NET) lend themselves to easy unit testing, because the application is developed within Visual Studio and can readily leverage its unit testing framework.
Visual Studio allows you to create Database projects, and database developers have started embracing it since Visual Studio Team Systems 2008 Database Edition GDR. This offers a robust framework for database developers to identify bugs with their database objects (schemas, stored procedures, functions, etc) by unit testing their database (T-SQL) code, before handing it over to the tester. Before we jump into the specifics of database unit testing with Visual Studio, the next couple of paragraphs warm us up to the topic by covering a few basic concepts of software testing.
Software testing undoubtedly plays an important role in the life cycle of most IT projects. The goal of any type of software testing is to identify defects to be fixed, so that the product meets requirements and has a deterministic and predictable output. Depending on the testing method employed, testing can be implemented at any time in the development process. Different software development models focus the test effort at different phases of the development process. Newer development models, such as Agile, often employ test-driven development and place an increased portion of the testing in the hands of the developer, before it reaches a formal team of testers.
Software testing methods are traditionally divided into white-box and black-box testing. These two approaches are used to describe the point of view that a test engineer takes when designing test cases. Unit testing falls under the category of white-box testing, where the tester has access to the internal data structures and algorithms, including the code that implements them. This is in contrast with the black-box testing method, which treats the software as a "black box", without any knowledge of its internal implementation. A black-box tester is usually not a programmer, and aims only to test the functionality of the software according to the applicable requirements. Since the black-box tester has no knowledge of the underlying code, he may find bugs that a programmer misses. However, the same principle can sometimes lead to writing inefficient or incomplete test cases.
Unit testing is a key component of test-driven development (TDD). Unit tests are usually written by developers while they work on the code, to ensure that a specific piece of code (function, class, stored procedure, etc.) is working as expected. Unit testing helps to identify defects in the earlier stages of the software development life cycle, where they are cheaper to fix. Unit testing can prove especially challenging in the world of database development, because of the need for a consistent test environment.
Database Unit Tests are used to establish a baseline state for a database and then to verify any subsequent changes that you make to database objects. The Unit Testing Framework in Visual Studio (starting with VSTS 2005) helps database developers create, manage and execute Unit Tests for a Database. The Microsoft.VisualStudio.TestTools.UnitTesting namespace supplies classes that provide unit testing support. This namespace contains many attributes that identify test information to the test engine regarding data sources, order of method execution, program management, agent/host information, and deployment data. It also contains custom unit testing exceptions.
You will need the Database Edition GDR of VSTS 2008, or the Ultimate or Premium editions of Visual Studio 2010, to create, modify and run database unit tests. You can run database unit tests with the Professional Edition of Visual Studio 2010, but cannot create or modify them. Before you can start running database unit tests, you must first create a database project and then create a test project. The next step is to write sets of Transact-SQL tests that exercise your database objects. Executing these tests in your isolated development environment helps you verify whether those objects are behaving correctly before you check them in to version control. As changes are made to the database schema, you can use these tests to verify whether the changes have broken existing functionality. A detailed step-by-step walkthrough for creating and running database unit tests can be found here on MSDN. Once created, the unit test project and its tests will show up in the Solution Explorer view;
In a typical database unit test, a Transact-SQL test script runs and returns an instance of the ExecutionResult class. The instance of this class contains a DataSet, the execution time, and the rows affected by the script. All of this information is collected during execution of the script. These results can be evaluated within the Transact-SQL script by using the RAISERROR statement, or they can be evaluated by using test conditions. Visual Studio Premium provides the following set of predefined test conditions for you to use:
- Data Checksum
- Empty Resultset
- Execution Time
- Expected Schema
- Not Empty Resultset
- Row Count
- Scalar Value
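Alongside these predefined conditions, results can also be evaluated inside the Transact-SQL test script itself with RAISERROR, as noted above. A minimal sketch of that style of test; the table name and expected count here are invented for illustration:

```sql
-- Hypothetical database unit test body: verify a lookup table was seeded correctly
DECLARE @cnt INT;

SELECT @cnt = COUNT(*)
FROM dbo.OrderStatus;   -- illustrative table name

-- Raising an error with severity >= 16 causes the unit test framework
-- to report this test as failed
IF @cnt <> 5
    RAISERROR('Expected 5 rows in dbo.OrderStatus, found %d.', 16, 1, @cnt);
```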
Test lists let you manage groups of these database unit tests; for example, you can:
- Run multiple unit tests as a group
- Run tests as a part of a Build
- Enforce check-in policy
- Software Testing Principles, Terminology & Definitions - http://en.wikipedia.org/wiki/Software_testing
- Unit Testing Framework in Visual Studio -
- Database Unit Testing with Visual Studio -
- Unit Test Conditions in Database Unit Tests - http://msdn.microsoft.com/en-us/library/aa833423
- Test Lists - http://msdn.microsoft.com/en-us/library/ms182461.aspx
- Visual Studio 11 (Beta) Unit Testing updates - http://blogs.msdn.com/b/visualstudioalm/archive/2012/03/08/what-s-new-in-visual-studio-11-beta-unit-testing.aspx
- Software Testing Jokes - http://softwaretestingfundamentals.com/software-testing-jokes/
Service Broker has come a long way since its introduction in Microsoft SQL Server 2005. SQL Server 2008 brought a host of Service Broker enhancements, like conversation priorities, the ssbdiagnose utility, alternate poison message handling, and new performance objects & counters. SQL Server 2012 is a major release that comes with an astonishing array of enhancements and new features, including new bells & whistles for Service Broker.
A significant new feature for Service Broker is the ability to multicast messages. This enables a single initiator service to send messages to multiple target services, akin to SQL Server Replication, where multiple subscribers can subscribe to the same publication. The syntax of the SEND command has been extended to enable multicasting by allowing multiple conversation handles:
DECLARE @cvh1 UNIQUEIDENTIFIER, @cvh2 UNIQUEIDENTIFIER, @cvh3 UNIQUEIDENTIFIER, @TestMsgBody XML;
SET @TestMsgBody = '<test>Test Message</test>';

BEGIN DIALOG @cvh1
    FROM SERVICE [//InitiatorService]
    TO SERVICE '//TargetService1'
    ON CONTRACT [//TestProcessing];

BEGIN DIALOG @cvh2
    FROM SERVICE [//InitiatorService]
    TO SERVICE '//TargetService2'
    ON CONTRACT [//TestProcessing];

BEGIN DIALOG @cvh3
    FROM SERVICE [//InitiatorService]
    TO SERVICE '//TargetService3'
    ON CONTRACT [//TestProcessing];

SEND ON CONVERSATION (@cvh1, @cvh2, @cvh3)
    MESSAGE TYPE [//TestMgsType] (@TestMsgBody);
Another new feature is the addition of a new column, message_enqueue_time, to queues. This column helps us determine the time spent by a message on the queue. The message_enqueue_time column is of the datetimeoffset data type and stores the date and time (with time zone awareness) when the message arrives on the queue. It's exposed to the application via a direct query on the queue itself, as well as a column in the RECEIVE statement. Let's look at an example:
-- Begin dialog & send message
DECLARE @ch UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @ch
    FROM SERVICE [manual_activation_initiator_service]
    TO SERVICE 'manual_activation_target_service'
    ON CONTRACT [manual_activation_contract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @ch
    MESSAGE TYPE [manual_activation_demo_request] ('<test>test_message_body</test>');
GO

-- A direct SELECT statement from the queue
SELECT conversation_handle
     , CAST(message_body AS XML) AS [msg_body]
     , DATEDIFF(second, message_enqueue_time, GETUTCDATE()) AS [time_in_queue(Seconds)]
FROM [manual_activation_target_queue];

-- Results
conversation_handle                  msg_body                        time_in_queue(Seconds)
------------------------------------ ------------------------------- ----------------------
C0CB3200-809F-E111-8B29-005056BE0088 <test>test_message_body</test>  21

(1 row(s) affected)

-- Retrieve the message off of the queue with a RECEIVE
WAITFOR (
    RECEIVE TOP(1) conversation_handle
         , CAST(message_body AS XML) AS [msg_body]
         , DATEDIFF(second, message_enqueue_time, GETUTCDATE()) AS [time_in_queue(Seconds)]
    FROM manual_activation_target_queue
), TIMEOUT 5;

-- Results
conversation_handle                  msg_body                        time_in_queue(Seconds)
------------------------------------ ------------------------------- ----------------------
C0CB3200-809F-E111-8B29-005056BE0088 <test>test_message_body</test>  21

(1 row(s) affected)
Please note that this column is not well documented yet, and to derive the time spent by a message on the queue in seconds, one must use a few date & time functions.
Service Broker remote messaging needs some additional configuration to work with Always On Availability Groups, due to the complexity added by the listener for an Availability Group. The Always On Availability Groups feature, introduced in SQL Server 2012, is a high availability and disaster recovery solution that provides an enterprise-level alternative to database mirroring.
For a service in an Availability Group to be able to receive remote messages:
- The Availability Group must have a listener configured
- Service Broker Endpoint must be created and configured with the Listener, for every instance of SQL Server in the Availability Group
- Connect permissions must be granted on the endpoints to the appropriate login(s)
- msdb must contain a route
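The receive-side checklist above can be sketched in T-SQL. Everything specific here (the endpoint name, port, login, route name and service name) is an illustrative assumption, not from the original post:

```sql
-- On EVERY instance that hosts a replica of the Availability Group:
-- create and configure a Service Broker endpoint (name and port are illustrative)
CREATE ENDPOINT SSB_Endpoint
    STATE = STARTED
    AS TCP (LISTENER_PORT = 4022)
    FOR SERVICE_BROKER (AUTHENTICATION = WINDOWS, ENCRYPTION = REQUIRED);

-- Grant connect on the endpoint to the appropriate login (illustrative name)
GRANT CONNECT ON ENDPOINT::SSB_Endpoint TO [CONTOSO\ssb_login];

-- In msdb: route incoming messages to the service in the AG database
USE msdb;
CREATE ROUTE AGIncomingRoute
    WITH SERVICE_NAME = '//TargetService',   -- illustrative service name
         ADDRESS = 'LOCAL';
```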
For a service in an Availability Group to be able to send messages to a remote service:
- Configure a route to the target service using the listener
- msdb must contain a route
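A hedged sketch of the send-side routes; the listener DNS name, port, service names and route names are all illustrative assumptions:

```sql
-- In the initiator database: route outgoing messages through the AG listener,
-- so messages follow the primary replica after a failover
CREATE ROUTE RouteToAGService
    WITH SERVICE_NAME = '//TargetService',                -- illustrative service name
         ADDRESS = 'TCP://aglistener.contoso.com:4022';   -- AG listener + SSB endpoint port

-- In msdb: a route for incoming acknowledgements and replies
USE msdb;
CREATE ROUTE ReplyRoute
    WITH SERVICE_NAME = '//InitiatorService',
         ADDRESS = 'LOCAL';
```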
And last but not least, it's always worth mentioning that since SQL Server 2008 R2, poison message handling can be disabled at the time of queue creation, or at a later time by using the ALTER QUEUE statement. You can also check the status of poison message handling for each queue by querying the sys.service_queues catalog view, which has the column is_poison_message_handling_enabled to indicate whether poison message handling is enabled or disabled.
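A quick sketch of the statements involved (the queue name here is illustrative):

```sql
-- Disable poison message handling when creating a queue
CREATE QUEUE dbo.demo_queue
    WITH POISON_MESSAGE_HANDLING (STATUS = OFF);

-- Or disable it later on an existing queue
ALTER QUEUE dbo.demo_queue
    WITH POISON_MESSAGE_HANDLING (STATUS = OFF);

-- Check the current setting for each queue
SELECT name, is_poison_message_handling_enabled
FROM sys.service_queues;
```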
I recently had the privilege of learning Advanced T-SQL with Itzik Ben-Gan (Blog|Twitter), a mentor and one of the founders of SolidQ. This 2 day Advanced T-SQL course spanned a range of interesting topics that a database developer would find extremely valuable on a day to day basis! Not only does Itzik deliver a remarkable lecture, but he also encourages questions and discussions, which adds a great deal of value to the learning experience. Some of the topics we covered in detail are:
- SQL Server Internals & Index Tuning – This was my very first deep dive into SQL Server internals pertinent to indexes. The multitude of index access methods that the optimizer can choose from, and their impact on query performance, has opened a whole new world of indexing strategies for me. Filtered indexes and columnstore indexes are some of the newer features I am looking forward to trying out.
- Temporary Tables – Never before has the distinction between Temporary tables, Table variables and Common Table Expressions (CTEs) been presented with so much clarity and lucidity. I now have a solid understanding of which to use in any given scenario.
- Sets vs Cursors – Again, a classic topic. Never hurts to reinforce the advantages of a Set based solution versus cursors.
- Query Tuning has never been so interesting with a real life example of dramatic performance gains between various revisions of the same query.
- Apply – I cannot believe I had used the APPLY operator only for XML queries so far. I have now learned its uses for unpivoting, aggregating and reusing column aliases.
- Grouping Sets – I had learned this one while studying for my MCTS certification, but had never got around to making much use of it. This gave me the chance to understand its full potential and applicability.
- Sequences – I had done some research on this topic for my earlier blog post, but I soon found out that there is still so much more to it, especially when comparing it with Identity.
- The enhancements to Window Functions, and the world of partitioning, ordering and framing, offer a whole new paradigm for OLAP queries used in reporting.
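One of the APPLY tricks mentioned above, reusing a column alias, is easy to show in a short sketch. The table and column names here are invented for illustration:

```sql
-- CROSS APPLY introduces the computed column once; its alias can then be
-- reused in the SELECT list and WHERE clause instead of repeating the expression
SELECT o.order_id,
       x.line_total,
       x.line_total * 0.07 AS tax
FROM dbo.orders AS o
CROSS APPLY (SELECT o.quantity * o.unit_price AS line_total) AS x
WHERE x.line_total > 100;
```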
Over the next few weeks, I intend to take the time to absorb all the new concepts and ideas I learned in this 2 day course, and find areas to implement these ideas.
SQL Saturday #118 | Madison, WI on Apr 21st 2012, definitely ranks amongst the most rocking SQL Saturdays I have attended ! Jes Borland [blog|twitter] and team produced a top-tier event for both speakers and attendees alike. This being their first SQL Saturday, I feel very fortunate to have been chosen as a speaker and present my session on Service Broker. Taking a 6 hour road trip on Friday evening, I rolled into Madison with the St.Louis contingent (Kim Tessereau [blog|twitter], Mike Lynn [twitter] & Jay Carter). It was too late to join the speakers’ dinner, but we did have an awesome dinner at a Hibachi grill !
On the morning of Apr 21st, we found our way into the Madison Area Technical College and were pleasantly welcomed at the registration desk – with no lines ! The SpeedPASS worked out really well for the event, and they also had a printer at the registration desk to help attendees print one. There were plenty of volunteers helping attendees find their way around the venue, and the breakfast was great. All of this made a great first impression – very well done !
Bill Fellows [blog|twitter] kicked off the event with an awesome session on TSQL sweets in SQL Server 2012, followed by an equally good session on SSIS 2012 new features by Norman Kelm [blog|twitter]. Next up, was my interview with Karla Landrum [blog|twitter], at the PASS booth, for the upcoming SQL Saturday event in St.Louis. It worked out for the best, to have a face to face discussion with Karla, because we ended up talking for over an hour and I learned so much more about the makings of a successful event. We have moved one step closer to hosting the very first SQL Saturday in St.Louis, in the Fall of 2012!
I hosted the Service Broker "Cows of a Spot" lunch table and it turned out to be an exciting opportunity to network with fellow Service Broker enthusiasts. The delicious baked beans and burgers, as well as the efficient crowd management skills of the volunteers, deserve a special mention. The next session on my list was about filegroups. Jes made an impressive presentation (LEGOs were involved!) about how filegroups help with performance and management of your database. Thanks to her presentation, I have learned the skill of piecemeal restore of selected filegroups.
I took a short break from SQL learning in the Speakers' Lounge to brush up on my presentation. I then picked up my shiny new speaker's shirt and settled down to review my presentation material. After dotting the i's and crossing the t's, I headed up to my session (the last session of the day). I was pleasantly surprised to see a few familiar faces in the audience. This was only the second time I presented my Service Broker session at a SQL Saturday, and I really appreciate all the support and positive feedback I got from the audience! One of the very useful pieces of information I picked up from the Q&A was the idea of using Extended Events to troubleshoot Service Broker applications, since Microsoft has announced that SQL Server Profiler will be deprecated in a future version.
The closing ceremonies followed and I am really happy for all the winners of the raffle prizes from the event’s generous sponsors. After a big round of applause for the speakers, volunteers/organizer and the sponsors, the St.Louis contingent hit the road for the long ride back home.
Having worked with Oracle databases for a good portion of the last decade, I was introduced to the sequence object early on. Since then, I had especially been missing this feature in SQL Server. But my wait is over with its introduction in SQL Server 2012!
A sequence is a user-defined, schema-bound object that generates a sequence of numeric values (in ascending or descending order) according to its specification. Unlike identity columns, a sequence is created independent of any table.
Before we explore the sequence object in more detail, here is a brief summary of the difference between identity and sequence.
- Sequence is table independent, while Identity is table specific
- Unlike Identity, you can obtain the new value from a sequence before using it in a SQL statement
- A new value for a sequence can be generated for UPDATE as well as INSERT statements
- You can define the minimum & maximum values, as well as cycling & cache size options, for a sequence
- Unlike Identity, a sequence object will not auto increment without any reference to it
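Two of these differences are easy to show in a couple of statements, assuming the dbo.demo_2012_sequence object created below; the table and column in the UPDATE are invented for illustration:

```sql
-- Obtain the next value BEFORE using it in a statement (not possible with IDENTITY)
DECLARE @next INT;
SET @next = NEXT VALUE FOR dbo.demo_2012_sequence;

-- Generate new values during an UPDATE (also not possible with IDENTITY)
UPDATE dbo.demo_table          -- illustrative table
SET ref_no = NEXT VALUE FOR dbo.demo_2012_sequence
WHERE ref_no IS NULL;
```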
You can create a sequence by using the T-SQL CREATE SEQUENCE statement, or in SQL Server Management Studio by navigating through the Object Explorer to Your Database > Programmability > Sequences, then right-clicking and choosing New Sequence.
Let's look at a simple T-SQL statement for creating a basic sequence:
CREATE SEQUENCE dbo.demo_2012_sequence
    AS INT
    START WITH 1
    INCREMENT BY 1
    MINVALUE 1
    MAXVALUE 10000
    CACHE 20
    CYCLE;
Some of the interesting points to note here are:
- The MINVALUE and MAXVALUE should fall within the range of the data type of the sequence object
- The INCREMENT BY can be positive (makes an ascending sequence) or negative (makes a descending sequence), but not Zero (0)
- CYCLE – Specifies if the sequence should restart when the maximum value ( minimum value for a descending sequence) has been reached
- CACHE – Can improve performance by pre-allocating the number of sequence numbers specified, thus minimizing disk IOs. Note that an unexpected shutdown may result in the loss of values remaining in cache.
Once this sequence object has been created, you can use the NEXT VALUE FOR function to generate a sequence number from it. It is a non-deterministic function and can be used in stored procedures and triggers. When using it in a query, please be aware of the limitations and restrictions on its usage; the most notable one is the restriction on using it in queries that contain set operators like UNION, UNION ALL, etc. It can also be used in a DEFAULT constraint, as well as with an OVER (ORDER BY ...) clause. A complete list of the restrictions and limitations on its usage is available here.
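As a quick illustration of the DEFAULT constraint usage just mentioned (the table and constraint names are invented for this sketch):

```sql
CREATE TABLE dbo.demo_orders
(
    order_id   INT NOT NULL
        CONSTRAINT df_demo_orders_id
        DEFAULT (NEXT VALUE FOR dbo.demo_2012_sequence),
    order_name VARCHAR(50)
);

-- order_id is populated from the sequence automatically
INSERT INTO dbo.demo_orders (order_name) VALUES ('first order');
```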
You can get one or more values out of the sequence object. You can also ALTER the sequence object to RESTART it, or change many of its properties. Let’s look at some examples.
- Example 1 – Get one value out of the Sequence Object at a time
-- get one value out of the sequence object at a time
SELECT NEXT VALUE FOR dbo.demo_2012_sequence AS seq_no;
GO
SELECT NEXT VALUE FOR dbo.demo_2012_sequence AS next_seq_no;
GO
And the results look like:
seq_no
-----------
1

(1 row(s) affected)

next_seq_no
-----------
2

(1 row(s) affected)
- Example 2- Restart the sequence and get multiple values out of the sequence object at one time
-- create a table and populate it with 5 rows
CREATE TABLE #demo_sequence (demo_name VARCHAR(12));
INSERT INTO #demo_sequence
VALUES ('row_1'), ('row_2'), ('row_3'), ('row_4'), ('row_5');
GO
-- restart the sequence from 1
ALTER SEQUENCE dbo.demo_2012_sequence
    RESTART WITH 1
    INCREMENT BY 1;
GO
-- get 5 values out of the sequence object at one time
SELECT NEXT VALUE FOR dbo.demo_2012_sequence AS seq_no, demo_name
FROM #demo_sequence;
GO
And the results look like:
seq_no      demo_name
----------- ------------
1           row_1
2           row_2
3           row_3
4           row_4
5           row_5

(5 row(s) affected)
You can also use the system object sys.sp_sequence_get_range to generate and return a number of values, along with some of the metadata related to the range.
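A minimal sketch of calling it, continuing with the sequence created above:

```sql
-- Reserve a contiguous range of 5 values in one call
DECLARE @first_value SQL_VARIANT;

EXEC sys.sp_sequence_get_range
     @sequence_name     = N'dbo.demo_2012_sequence',
     @range_size        = 5,
     @range_first_value = @first_value OUTPUT;

SELECT CAST(@first_value AS INT) AS first_value_in_range;
```

The procedure also exposes OUTPUT parameters such as @range_last_value and @range_cycle_count for the range metadata mentioned above.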
The sys.sequences view can also be used to query metadata for a sequence object. It contains one row for each sequence object in the database.
SELECT current_value, cache_size, is_cached, is_exhausted
FROM sys.sequences
WHERE name = 'demo_2012_sequence';
GO
And the results look like:
current_value  cache_size  is_cached  is_exhausted
-------------- ----------- ---------- ------------
5              20          1          0

(1 row(s) affected)
In conclusion, the new Sequence Object can provide a viable alternative to Identity, and is definitely worth exploring for your next project.