I recently had an opportunity to work with MySQL for the first time, while volunteering to code for a charitable cause. It's been a year or so since I have worked with anything but Microsoft SQL Server, so it was interesting to revisit the challenges of working with an open source database.
At first, I searched the internet for a client tool like SQL Server Management Studio to work with MySQL. This led me to MySQL Workbench, which was easy to download and install, and quite intuitive to learn. I was impressed with the built-in data modeling features and the familiar interfaces that allowed me to browse through database objects and run queries. However, after exploring further, I realized that in order to connect to a remote MySQL database, my user/login must be set up to allow access from a client machine (there are options to allow a user/login to access a MySQL database from any client machine).
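For context, granting that kind of remote access is done on the MySQL server itself. Below is a minimal sketch of what it looks like; the user name 'appuser', the password, and the database name 'appdb' are hypothetical placeholders, not names from the actual project:

```sql
-- Hypothetical user and database names, for illustration only.
-- The '%' host wildcard allows this user to connect from any client machine;
-- replace '%' with a specific hostname or IP address to restrict access.
CREATE USER 'appuser'@'%' IDENTIFIED BY 'a_strong_password';
GRANT ALL PRIVILEGES ON appdb.* TO 'appuser'@'%';
FLUSH PRIVILEGES;
```

Of course, someone with administrative access to the server (or a tool like cPanel, as I found out later) has to run this for you.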
So I went back to Google searching for answers, and came across phpMyAdmin, which looked quite promising. Its installation document, however, seemed much more complicated and required several prerequisites, including an Apache web server with PHP. Drawing upon my experience trying to learn Hadoop, I figured there might be a nicely packaged distribution that bundles together all of the components I need for MySQL, similar to Cloudera's or Hortonworks' distribution of Hadoop. I was happy to discover XAMPP – an easy-to-install Apache distribution containing MySQL, PHP and Perl, which also includes phpMyAdmin. However, this still left me unsure whether it would solve my problem of accessing a remote MySQL database to create tables and stored procedures.
I got my breakthrough while talking to other volunteer coders on the project, when I got access to cPanel – a graphical web-based control panel that simplifies website and server management (including MySQL databases). Logging into cPanel, I was glad to see a "Databases" section that included tools like the "MySQL Database Wizard" and "phpMyAdmin".
In the next part of this blog, I plan to write about creating a MySQL Database, adding users and creating database objects.
Our production environment consists of SQL Server 2008 R2 with several databases across multiple SQL Server instances. We follow a somewhat old-school approach to deployment: once a project is past QA and in the Stage/UAT environment, we no longer create and deploy builds in a cumulative fashion. When bugs are found in the Stage/UAT environment, the builds that fix those bugs (an iterative cycle) are preserved and deployed sequentially, as-is, in production as well. If we needed 10 iterations (hence 10 builds) to fix a bug in Stage/UAT, we deploy the same 10 builds to production sequentially!
This tediously meticulous approach guarantees (in theory) the repetition of the same successful path to deployment in production that was taken in the Stage/UAT environment. It leads to the same quality of code being deployed to production as was deployed to Stage/UAT, and hence is expected to produce the same results (in theory). However, the number of iterations needed to fix all bugs in Stage/UAT is often large enough that we routinely end up with builds running into double digits. Efficiently and accurately deploying 10-plus builds to production within a relatively short deployment window was becoming a challenge for us (our DBA is expected not only to log deployment results, but to proceed with the next script ONLY upon success of the previous script). While we were not ready to fully automate the execution of our deployment scripts via a batch run, we needed a command line method for deploying our SQL scripts relatively fast, where the execution messages are not only captured in a log file, but also displayed on the screen. This would not only let our DBA identify whether a script's execution encountered any errors without having to open the log file, but also help execute the deployment faster than a fully manual, SSMS-based approach.
Our first attempt at a fair degree of automation, to speed up deployment time by reducing manual work, was SQLCMD. I have a simple test script here with a few PRINT statements, one simple SELECT statement that executes successfully, and another simple SELECT statement that fails due to a non-existent table (to simulate a script failure scenario). Do take note that my script uses the SQLCMD command ":on error exit", which causes the batch to stop execution upon encountering an error. I have named the script quite creatively as "test.sql".
USE Demo;
GO
:on error exit
PRINT N'Deploying Demo Script...';
GO
SELECT COUNT(*) FROM [dbo].[demo_order];
GO
PRINT N'Running query against non-existing table...';
GO
SELECT COUNT(*) FROM [dbo].[does_not_exist];
GO
PRINT N'This PRINT should not run as previous query errors and batch should exit...';
GO
When run in SSMS (in SQLCMD mode), this script produces the following output, and exits the batch upon encountering the first error, as expected:
Deploying Demo Script...

(1 row(s) affected)
Running query against non-existing table...
Msg 208, Level 16, State 1, Line 2
Invalid object name 'dbo.does_not_exist'.
** An error was encountered during execution of batch. Exiting.
The quickest way to automate the execution of my test script is to use SQLCMD via the command line. Note the "-b" option used in my SQLCMD command string, which forces termination of the batch upon encountering errors. This is functionally similar to using the ":on error exit" SQLCMD command within the script itself. Here is the simple command line string:
sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql -o test.sql.log.txt
When this SQLCMD command string is executed in the command prompt, it creates the log file documenting the error message and the fact that the batch was terminated. However, note that the command prompt screen shows no indication of the success or failure of the script.
Unless our DBA opens up the log file "test.sql.log.txt" for review, he cannot see the execution and error messages shown below. (I could use the "type" command on the next line here, but we prefer to have a single-line command.)
Changed database context to 'Demo'.
Deploying Demo Script...

-----------
12

(1 rows affected)
Running query against non-existing table...
Msg 208, Level 16, State 1, Server WKS18176\SANIL_2012, Line 2
Invalid object name 'dbo.does_not_exist'.
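For completeness, the two-line command prompt alternative mentioned above (which we decided against, preferring a single-line command) would look something like this sketch, using the same server and file names:

```
sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql -o test.sql.log.txt
type test.sql.log.txt
```

This echoes the log to the screen only after the entire script finishes, whereas we wanted to see the messages as they happened.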
This is where PowerShell came to our rescue. With a minor modification to the SQLCMD command itself – piping its output to the Tee-Object cmdlet – we were able to not only log the execution messages to a file, but also display them on the PowerShell screen, without losing any functionality related to exiting the batch upon error.
sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql | Tee-Object -file test.sql.log.txt
Here is a screenshot of executing my test script via PowerShell.
This was my first time using PowerShell, and I am impressed by how quickly we were able to learn and use it. Over the next few weeks, I am going to explore PowerShell further and learn how I can apply it to ease some more of our automation pain points!
- SQLCMD – http://technet.microsoft.com/en-us/library/ms165702(v=sql.105).aspx
- PowerShell – http://msdn.microsoft.com/en-us/library/dd835506(v=vs.85).aspx
With only 9 weeks left until the end of the year, I figured I should start getting serious about meeting some of my goals for 2013. I did successfully complete my first goal for 2013 – speaking at the PASS Summit last week. My second goal for the year is to achieve the MCSA: SQL Server 2012 certification. Since I am Microsoft Certified (MCTS) on SQL Server 2008, I will be taking the two transition exams to MCSA: SQL Server 2012 – 70-457 and 70-458 – instead of the usual three exams. However, in the absence of any training materials specifically geared towards these transition exams, I am using the standard training kit from Microsoft Press to augment my real work experience on SQL Server 2012. The Microsoft Press training kit consists of three books, one each for exams 70-461, 70-462 and 70-463. These training kits are available on Amazon. I also happen to have a subscription to Pluralsight's training library, which has over 8 hours of training videos for exam 70-461 by Christopher Harrison and over 16 hours of training videos for exam 70-462 by Sean McCown.
I am hoping that the combination of these two learning resources will help me prepare for the MCSA: SQL Server 2012 exams in about 90 days (just over 12 weeks). My training schedule is inspired by Microsoft's "90 Days to MCSA: SQL Server 2012 Edition" program. I intend to continue writing a weekly blog post documenting my learning and progress towards achieving the MCSA: SQL Server 2012 certification. I hope to hear a lot of feedback and support from #sqlfamily on this journey!
I am honored and excited to be selected to speak at the PASS Summit 2013 in Charlotte, NC – Oct 15th through 18th! I will be talking about "Database Unit Testing" with Visual Studio. This session highlights the importance of unit testing in the development life cycle of a database application. Unit testing a database application is definitely a lot more challenging than unit testing a VB.NET or C# application. Creating a consistent database test environment involves not only database code, but also the data itself. More often than not, due to the time and effort involved in creating a consistent database test environment, unit testing database code is rarely given a thought upfront during development. This usually leads to late discovery of bugs, which become more expensive to fix as the development life cycle progresses. Visual Studio, with Database Projects and more recently with SQL Server Data Tools (SSDT), has made unit testing fairly easy to implement. During the course of this session, we will cover the concepts of unit testing and demonstrate the implementation of unit tests for a Database project and an SSDT project in VSTS 2010 and VSTS 2012 respectively. If you have already implemented database unit test projects in VSTS 2010, we will also go through a demo of upgrading them to SSDT.
I have presented this session at several SQL Saturday events, user group meetings and regional conferences, and I am looking forward to bringing it to the PASS Summit. I look forward to seeing you all at the Summit in October!
We had a great SQL Saturday #236, the second annual St.Louis SQL Saturday event, on Aug 3. As always, PASS played a big role in helping make the event successful by providing the necessary infrastructure to run it.
We moved the event to a different facility this year, the Wool Center at SLU. SLU provided the venue for this year's event, as well as a few of their staff members to help us out on the day of the event. We could not have asked for any better. We definitely plan to continue hosting future St.Louis SQL Saturdays at SLU.
I would also like to thank the core team of organizers – Mike Lynn, Jay Carter, Danielle Mayers and Kim Tessereau – for putting in a lot of hard work to make this event possible. There were also several volunteers who helped out at the registration desk, lunch line and classrooms, all of whom deserve a big thanks. Organizing a successful SQL Saturday is definitely a team effort, and I could not have asked for a better team for this event. No SQL Saturday event is possible without the speakers who contribute their time and skills to present at the event. The generous support of all event sponsors plays an equally important role.
Last but not least, all the attendees who took the time to attend this event on a Saturday, and who are passionate about learning as well as the SQL community, deserve a big round of applause as well.
As organizers of the event, we noted a few improvements that can help us make the 2014 event even better:
1. Event Date – Quite a few of our regular local speakers, as well as several potential attendees, could not make it to the event due to vacation plans. Several SQL Saturday organizers from the mid-west region had similar experiences in the months of July and August. We are planning for an event date in September for the 2014 St.Louis SQL Saturday.
2. Communication of Event Start Time and SpeedPASS – Though the first lecture of the day started at 9:30 AM and the registration desk opened at 8:30 AM, we had several attendees show up for the event before 8 AM. Some of the sponsor representatives did not get the directions to the free parking lot. We will definitely be much clearer with our communication in the future. On the bright side, over 60% of the attendees came in with a printed SpeedPASS, which helped the registration process move smoothly.
3. Lunch – We seem to have erred on the side of caution again while ordering lunch for the attendees, volunteers and sponsors. While we donated the leftover lunch boxes to the building staff, we intend to plan the lunch orders better for the 2014 event.
4. After Party – We intend to explore a venue closer to the SLU campus for the 2014 event's after party.
5. Recommended Hotel – While we were unable to secure a discount at the nearby hotels for the 2013 event, we intend to start negotiating with these hotels earlier for the 2014 event.
Please follow these links to view the pictures taken during this event:
Please feel free to send us your feedback and suggestions to make the St.Louis SQL Saturday event better !
The second annual St.Louis SQL Saturday is coming up in less than 2 weeks, on Aug 3rd 2013 at the Wool Center on the SLU campus (3545 Lindell Blvd, Center for Workforce & Organizational Development – 2nd floor, St. Louis, MO 63103). The event is a full day of free SQL Server training, consisting of 20 sessions on topics like Database Administration, Business Intelligence, Application Development and Professional Development. Free parking for the event is available in the SLU parking lot across Olive St (Theresa Lot). No parking passes are needed for event attendees to park in this lot. Street parking is at your own risk – the City of St.Louis metered parking spots are no longer free on the weekends. From the parking lot, follow the yard signs for SQL Saturday. Please follow this link for the floor plan of the venue. This will help you familiarize yourself with where the registration desk, classrooms and facilities are. Please note that our event is on the 2nd floor of the building.
- Hotel Ignacio – 0.2 miles
- Courtyard St.Louis Downtown (Marriott) – 1 mile
- Pear Tree Inn Union Station (Drury) – 1.1 miles
On the day of the event, the registration desk will open at 8:30 AM and the first lecture of the day starts at 9:30 AM. A few days before the event, all registered attendees will receive an email with a link to their SpeedPASS. The first 50 attendees who sign in at the registration desk with a SpeedPASS will get a free SQL Saturday t-shirt. If any of your friends or colleagues are interested in attending the event, please do encourage them to register as soon as possible (on-the-spot registrations are accepted, but they lead to long lines and waiting times for attendees). Box lunches will be provided at the event for a nominal fee of $10, only to those attendees who pay the lunch fee in advance, before 29th July 2013 (a payment link for the lunch fee will be sent out shortly). We give our caterers the head count on the Tuesday before the event. During lunch, several of our Gold level sponsors will be talking about their products and services in various classrooms. The "Women In Technology" panel talk will also be held during the lunch break, in one of the classrooms. Please visit the event Schedule page for details.
SQL Saturday events are made possible thanks to the contributions of speakers and volunteers, and the generous support of the vendors. Please do thank the speakers and volunteers for all the work they put in to host these events. Several sponsors set up booths at the event and offer raffle prizes to the attendees. Do stop by their booths – they are always excited to talk to you about their products and services. If you would like a chance to win one of their raffle prizes, please drop your raffle tickets at their booths. All raffle prizes will be drawn at the end of the day. You must be present to win, and each winner can win only one raffle prize. Pluralsight has offered a raffle prize of a free one-year subscription to their entire training library (worth $299), and your attendee tickets dropped at the registration desk will qualify you to participate in the raffle for this prize. There will be several vendor gifts as well as books to raffle away. xSQL Software is offering all attendees of this event a free license of their "xSQL Data Compare for SQL Server" ($349 value). Please follow this link for details. This offer is valid ONLY on Aug 3rd and 4th of 2013. We have planned an informal gathering after the event (the after party) at Schlafly Bottleworks. Please note that the event organizers are only suggesting a venue for all the attendees to get together. All individuals are responsible for finding their own tables and paying for their own food and beverages. Please see this link for after party details.
As usual, we request all attendees to be respectful of the venue and its property. SLU has generously offered the use of their facilities for this event, and we definitely want them to continue supporting our event for many years to follow.
If you need any more reasons to convince your friends or co-workers to attend the St.Louis SQL Saturday, please read Kathi's blog on the top 10 reasons to attend a SQL Saturday. Hope to see you all on Aug 3rd 2013 for an awesome STL SQL Saturday!
SQL Saturday #214, Louisville KY, is coming up this weekend (July 13, 2013) and I am looking forward to a trip to my favorite city in the mid-west. I was selected to present at the 2012 Louisville SQL Saturday and it was one of the best SQL Saturdays I have attended so far (they get extra points for keeping ice cream in the speakers' lounge!). I am excited to be chosen to present at the 2013 Louisville SQL Saturday. I will be talking about Service Broker, and my session is scheduled for 1 PM in Room #3. This is definitely my favorite topic; I started my life as a speaker by presenting it at the 2011 Kansas City SQL Saturday event. Over time, I have kept this session updated with advancements in SQL Server 2012 and the feedback I have received over numerous events in the last 2 years.
So if you would like to learn about Service Broker, understand its applications in real-life situations, and learn how to implement and troubleshoot Service Broker applications (with plenty of demos!), I look forward to seeing you at my session this Saturday.
Spring 2013 is shaping up to be a busy season for talking about SQL Server. So far I have an exciting lineup of five sessions scheduled for March and April of 2013.
- On 3/8/2013 at noon MST, I will be talking about Service Broker with the PASS Application Development Virtual Chapter. I am really looking forward to this one, not only because it will be my first talk with a PASS Virtual Chapter, but also because Service Broker is one of my favorite topics. This was the topic of my very first presentation, at the SQL Saturday in Kansas City, back in 2010.
- On 3/9/2013, I will be talking about Database Unit Testing with Visual Studio at the Greater Midwest SQL Relay. Oakwood Systems organizes this annual conference in the spring, and it's a huge hit in St.Louis. If you are in the area, please do make time to attend this free, full-day event of top-notch SQL Server training.
- On 3/11/2013, I will be talking about Parameter Sniffing at the Capital Area SQL Server User Group (a PASS local chapter) in Albany, NY. I especially like this talk because I learned about parameter sniffing while trying to tune a query that was suffering from intermittent performance issues. Right around that time, I happened to come across Grant Fritchey's chapter on the same topic in the MVP Deep Dives Vol. 2 book. The content therein was so awesome, I decided to make a presentation out of it, and it has been a popular topic at several SQL Saturdays in the mid-west region.
- On 3/26/2013, I will be talking about Database Unit Testing with VSTS at the St.Louis Metro East .NET User Group in O'Fallon, IL. The first time I talked about this topic was at St.Louis Days of .NET in Aug 2012, and I have noticed a renewed interest in it since the release of VSTS 2012.
- On 4/13/2013, I will be talking about Database Unit Testing with VSTS at SQL Saturday #211 in Chicago, IL. SQL Saturday #31 (Chicago), in April of 2010, was the very first SQL Saturday I attended, and it got me involved with the SQL Server community. It's a huge honor for me to be selected to speak at the 2013 Chicago SQL Saturday and I am looking forward to this trip!
Understanding the numerous types of variables SQL Server has to offer, and their appropriate usage, is one of the cornerstones of developing effective database code. I recently helped a co-worker fix an error message in his code:
-- Error
Msg 137, Level 15, State 2, Line 3
Must declare the scalar variable "@l_INT".
The code snippet looked something like this, which led to a discussion about the scope of T-SQL variables and an interesting way to fix this problem.
CREATE TABLE #scope_test1 (col1 INT);
CREATE TABLE #scope_test2 (col1 INT);
GO
DECLARE @l_INT INT = 42;
INSERT INTO #scope_test1 (col1)
SELECT @l_INT;
GO
INSERT INTO #scope_test2 (col1)
SELECT @l_INT;
GO
The T-SQL language supports local variables (their names begin with a single @). The scope of a variable is the range of Transact-SQL statements that can reference it, and it lasts from the point the variable is declared until the end of the batch or stored procedure in which it is declared. A T-SQL batch is a group of statements that SQL Server parses and executes as a single unit. A client tool like SQL Server Management Studio (SSMS) marks the end of a batch with the GO command (you can configure any word to be the batch separator in SSMS, but we will leave that discussion for another time). In the code snippet above, the GO after the first INSERT ends the batch, so @l_INT is out of scope in the second INSERT – hence the error.

The names of some Transact-SQL system functions begin with two at signs (@@). They are commonly referred to as global variables. You do not need to declare them, since the server constantly maintains them; strictly speaking, they are system-defined functions, not variables, and do not have the same behaviors as variables. Global variables represent information specific to the server or the current user session. Some of the commonly used ones are @@ERROR, @@IDENTITY and @@VERSION.

Traditionally, the DECLARE command is used to declare a local variable, and SET or SELECT is used to initialize its value. Since SQL Server 2008, both of these tasks can be accomplished in a single statement:
DECLARE @MyCounter INT = 12 ;
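As an aside on the global variables mentioned above, they can be read like any other expression. Here is a minimal sketch (not from the co-worker's code) that uses a divide-by-zero error to show @@ERROR in action:

```sql
DECLARE @result INT;

SELECT @result = 1 / 0;   -- raises runtime error Msg 8134 (divide by zero);
                          -- the batch continues, since this error only terminates the statement

SELECT @@ERROR;           -- returns 8134, the error number of the previous statement

SELECT @@ERROR;           -- returns 0: @@ERROR is reset by every statement,
                          -- including the successful SELECT above
```

The last line illustrates why @@ERROR is typically captured into a local variable immediately after the statement of interest.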
The variables we discussed so far are scalar variables, which hold a single data value of a specific type. A table variable is a special data type that can be used to store a result set for processing at a later time. The table data type is primarily used for temporary storage of a set of rows, such as the result set of a table-valued function. Functions and variables can be declared to be of type table, and table variables can be used in functions, stored procedures, and batches. A DECLARE statement (similar to local scalar variables) is used to declare a table variable. They behave exactly like local variables, with a well-defined scope, and can be thought of as being similar to temporary tables, but with several limitations.
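Here is a minimal sketch of declaring and using a table variable (the table and column names are made up for illustration), showing that it goes out of scope at the end of the batch just like a scalar variable:

```sql
DECLARE @order_totals TABLE
(
    order_id INT            NOT NULL,
    total    DECIMAL(10, 2) NOT NULL
);

INSERT INTO @order_totals (order_id, total)
VALUES (1, 100.00),
       (2, 250.50);

-- A table variable is queried like any other table...
SELECT order_id, total
FROM @order_totals
WHERE total > 150.00;
GO

-- ...but referencing @order_totals here would fail with
-- "Must declare the table variable", as it went out of scope at GO.
```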
The SQLCMD utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file, or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches. Scripting variables can be used in SQLCMD scripts; they enable one script to be used in multiple scenarios. The setvar command is used to define scripting variables, which are stored internally. Scripting variables should not be confused with environment variables, which are defined at the command prompt by using SET.
The scope of a SQLCMD scripting variable can span several batches, which can be used to implement variables that don't go out of scope even when a batch ends. The example below is a simplified demo of the same:
:setvar l_int "42"
CREATE TABLE #scope_test1 (col1 INT);
CREATE TABLE #scope_test2 (col1 INT);
GO
INSERT INTO #scope_test1 (col1)
SELECT $(l_int);
GO
INSERT INTO #scope_test2 (col1)
SELECT $(l_int);
GO

-- Results
(1 row(s) affected)
(1 row(s) affected)
A review of the types of variables the T-SQL language has to offer has helped refresh my understanding of their appropriate usage.
- Table Variables – http://msdn.microsoft.com/en-us/library/ms175010.aspx
- T-SQL Variables – http://msdn.microsoft.com/en-us/library/ms187953(v=sql.105).aspx
- Global Variables – http://www.codeproject.com/Articles/39131/Global-Variables-in-SQL-Server
- Scripting variables with SQLCMD – http://msdn.microsoft.com/en-us/library/ms188714.aspx
- SQLCMD utility – http://msdn.microsoft.com/en-us/library/ms162773.aspx
BIBrews is a unique opportunity for like-minded technology experts to informally gather for drinks and food and discuss the pressing questions around Business Intelligence and SQL technical trends and experiences. The content is unrehearsed and open. BIBrews has a strict “no sales” and “no recruiting” policy.
The first meeting of STL BIBrews is scheduled for Thursday, Dec 13, 2012, from 5:30 PM to 8:30 PM (CST) in St.Louis, MO. Please register at EventBrite using this link – http://bibrewsstl.eventbrite.com/ . You can follow Scott Shaw on Twitter (@shawsql) and/or @BIBrews for the latest updates.