Migrate SQL Server Database to Azure SQL Database using Data Migration Assistant – final steps

The first two steps can be found at http://thedataengineer.com/2018/08/29/migrate-sql-server-to-azure-sql-server/

Part 3: Migration Assistant

Open the Data Migration Assistant and click +New to start a new migration

Choose Migration

Fill in the project name and keep the defaults for the other options and click Create


On the first screen, fill in the server name, choose your connection properties, and click Connect. Reminder: the credentials used to connect must have CONTROL SERVER permission


After connecting, choose the source database. Keep Assess the database before migration checked and click Next


On the select target screen you have the option to either use an existing server to connect to and choose the target database or you can click on Create a new Azure SQL Database, which will take you to the Azure portal to create the new database.

I went out to the Azure portal and created a new server and database for the migration. I then came back into the DMA window, filled out the info requested, and clicked Connect. I am also using SQL Server Authentication to connect.


Choose the target database and click Next


Select the objects you want to migrate

Review the assessment and determine whether you can fix each issue, or uncheck an issue if you do not wish to migrate the blocked objects


I also had an issue where some of the users could not be migrated, as they were in the form domain\user. I left these unchecked and will manually add any users to the new Azure SQL Database


Click Generate SQL script. This may take a little time to run

Once completed, review the script. When you are satisfied with the script, click Deploy schema.


The following screen will appear


On completion, check for errors and determine if you can move on. Once you are satisfied with the schema deployment, click Migrate data


Check that all of your tables are marked as OK. Also, you can see at the top that Microsoft strongly recommends that you temporarily change your Azure SQL database to performance level P15 during the migration process. The database that I am migrating is so small that I am going to leave it as is. Click Start data migration


The in-progress screen appears. You can also monitor some of the activity in the Azure portal. For instance, Resource Utilization and used space


Once the data migration has completed, you will see at the bottom right that the migration is complete, along with the duration


I then connected to the Azure SQL database using SQL Server Management Studio and ran some comparisons to the original source and detected no differences.

Reminder: add any users to the Azure SQL Database that were not automatically migrated.

You have now completed the migration.

Migrate SQL Server Database to Azure SQL Database using Data Migration Assistant

We will be using the Data Migration Assistant to assess and then migrate an on-prem SQL Server 2012 SP2 database to an Azure SQL Database

Part 1: Download and install Data Migration Assistant

To install DMA, download the latest version of the tool from the Microsoft Download Center, and then run the DataMigrationAssistant.msi file.

Click next

Accept the license agreement

Click install

Wait for the install to complete

Check the box beside Launch Microsoft Data Migration Assistant and click Finish

Part 2: Open Data Migration Assistant and create assessment

We will now start with assessing our on-prem SQL Server

  1. Click the +New icon and select Assessment project type
  2. Give your assessment a name
  3. We will leave the Source server type and Target server type as is.
  4. Click create

I left the defaults checked and clicked next

Fill in the server name and authentication type. Note: the credentials used to connect must be a member of the sysadmin role

Click connect

Choose the sources and click add. I am only interested in the CentralDB database at this time

If you are satisfied with the sources that you have added, click Start Assessment

Obviously, the assessment run time depends on how many sources you selected and their size

When the assessment completes

You can see the results on the screen, or you can export the report as a JSON or CSV file.

In this example, you can see that there are 3 Unsupported features and 1 Partially supported feature.

With each issue, the details section gives the impact, a recommendation, and more information
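If you export the assessment, the report can also be post-processed with a short script, which is handy when you assess many databases. The sketch below filters an exported record list down to the unsupported items; note that the field names used here (Title, FeatureSupport) are illustrative placeholders, not DMA's actual JSON keys, so check them against a real export first.

```python
def unsupported_features(records):
    """Return the titles of assessment records that are not fully supported.

    NOTE: the keys "Title" and "FeatureSupport" are illustrative stand-ins;
    inspect the JSON your DMA version exports and adjust accordingly.
    """
    return [r["Title"] for r in records
            if r.get("FeatureSupport") != "Supported"]
```

Combined with json.load over the exported file, this makes it easy to track recurring blockers across several assessed databases.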

One last screenshot to show the assessment options

You can open, restart or delete an assessment

In part 3, we will look at Migrating a SQL Server database to Azure SQL Database.

Geo-Replication in Azure

I was recently asked to look at creating a read-only copy of a SQL Server database in China, with the source being in the UK. On-premises, to get an AlwaysOn Availability Group with a read-only replica, we would need to be running Windows Server Failover Clustering and the database would need to be Enterprise Edition. However, this is easily set up in Azure using Geo-Replication, and the copies are only seconds apart in replication.

I decided to do some testing. Geo-Replication is available at all price points in Azure, so for testing I spun up a database in the Basic pricing tier in the West US region. Once the database was available, I created a simple table with 4 rows of data and then confirmed the data existed. The next step was to go back into Azure and, under the newly created database resource, click Geo-Replication. This opened another window that allowed me to choose a location. The picture above shows all of the available locations: the blue hexagon is your primary database and the green hexagons are the available replication locations. For this one, I chose East Asia. The cost for this geo-replicated readable copy is the same as the primary copy, so in my case, each database was $5 per month.

 

The Geo-Replicated database will have the same database instance name, but will have a different server name.

As part of creating the new readable database, Azure will copy all of the data that is in the primary to the readable. Since I didn’t have much data, this process only took a minute or so. Once I got the notification that the readable was created, I connected via SSMS and was able to see the data. I then did some basic testing of deletes, inserts and updates. Obviously, the amount of data will make a difference, but as for speed, as soon as I hit execute in the primary database window and refreshed my select on the readable copy, the changes were there.

If you are looking for a read only copy of a SQL Server database, I definitely recommend Azure Geo-Replication. The whole process took less than 10 minutes to set up.

Loading Linux command line logs in to Windows SQL Server

We recently decided, for auditing reasons, that we would like to keep a record of all of the commands that were run on our Linux machines, and because of separation of duties, we would load those logs into a SQL Server instance on Windows. Funny how I feel the need to avoid confusion and make sure I say where the SQL Server instance is.

1. First, to be able to get the files, I needed to be able to SSH to the Linux servers. The easiest way to do this was with PuTTY
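PuTTY's command-line client, plink.exe, is what makes this step scriptable. Here is a minimal sketch (in Python, purely for illustration) of assembling a plink call; it assumes plink.exe from the PuTTY suite is on the PATH, and the server name and remote log path in the example are made up:

```python
def build_plink_command(server, user, password, remote_command):
    """Assemble a non-interactive plink invocation.

    Assumes plink.exe (from the PuTTY suite) is on the PATH.
    -batch disables interactive prompts so the script never hangs;
    -pw supplies the password on the command line.
    """
    return ["plink", "-batch", "-ssh", f"{user}@{server}",
            "-pw", password, remote_command]

# Hypothetical usage (server name and log path are examples only):
# subprocess.run(build_plink_command("lnx01", "auditor", "secret",
#                                    "cat /var/log/cmdlog.txt"))
```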

I have separated the process into two scripts: the first downloads the logs to the Windows server, and the second loads the logs into SQL Server

2. Download logs to Windows Server

Since I am connecting to numerous Linux servers, I use a file to store the user names, passwords, and server names, and I iterate through each line in the file to connect and get the logs.

The files manager.txt and others.txt only contain the password used by that user.

Example server name file

3. Script to get server names, user names and passwords

 

4. Create a SQL Server table with the following columns: filename, command, workingDirectory, server, username and date

5. Load logs into SQL Server

Example log file from Linux – As you can see, all of the commands came in as comma separated, but they were all on one line.

 

Using the Get-Content command changed the file to a usable format; we used the replace operator to remove the space and the command number, and we used the Set-Content command to save this as a .txt file
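The same clean-up step can be sketched in Python: split the single line on commas, then strip the leading history number from each entry. The input layout assumed here is based on the example log above, so verify it against your own files:

```python
import re

def split_history(raw_line):
    """Split a one-line, comma-separated history dump into commands,
    removing the leading history number (e.g. " 12  ls -l" -> "ls -l")."""
    commands = []
    for chunk in raw_line.split(","):
        # drop surrounding whitespace, then the numeric history index
        cleaned = re.sub(r"^\s*\d+\s+", "", chunk.strip())
        if cleaned:
            commands.append(cleaned)
    return commands
```

Each cleaned command then maps onto the command column of the table from step 4, alongside the filename, server, and user name captured during the download.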

 

6. SQL Server table

 

SQL Server Reporting Services Timeout

The other day we ran into an issue with a report that runs for 30 minutes – this may be the only report we have that runs this long – and then fails. The only error message was “Report processing has been canceled by the user. (rsProcessingAborted)”, but after checking, no one had canceled the report processing. What just happened?

It was determined that the Report timeout option was set to “Use the default setting” like most of our reports are.

 

Well, where is this default setting and what is it set to? The default timeout is set on the Site Settings page of SQL Server Reporting Services, and it was set to 1800 seconds or, as you guessed, 30 minutes.

 

You have two options to fix the timeout issue.

1. Change the default settings timeout for the whole site in the site settings page.

2. Change the Report timeout setting just for that report. This is done by going to the report, clicking the … and choosing Manage.

You then scroll down to Advanced and either choose “Allow the report to run for (add amount of seconds here) seconds before timing out”, changing the seconds to a value high enough for the report to finish, or choose the “Allow the report to run indefinitely (no timeout)” option.

 

Click Apply and your report should now complete.

Connect to SQL Server when you don’t know SA account or don’t have sysadmin access

Below are the steps you need to perform to grant SYSADMIN access to a user in SQL Server in case you are completely locked out.

1. Download PsExec so you can connect with SQL Server Management Studio as NT AUTHORITY\SYSTEM

Download PsTools from https://download.sysinternals.com/files/PSTools.zip
Unzip the content and copy PsExec.exe to C:\Windows\System32

2. Stop the SQL Server and SQL Server Agent services on the server.

3. Open a cmd prompt window as administrator and navigate to SQL Server’s Binn directory. You may need to adjust your path based on your install location.

ex. C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn

4. Once you are in SQL Server’s Binn directory, run the ‘sqlservr -m’ command to start SQL Server in single-user mode as shown below. I had to add the location of the ERRORLOG

sqlservr -m -e "C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Log\ERRORLOG"

If it’s a named instance:
sqlservr -m -s <instancename> -e "C:\Program Files\Microsoft SQL Server\MSSQL11.SERVICECORE\MSSQL\Log\ERRORLOG"

After the SQL Server instance was started in single-user mode, I was receiving logon errors from other users who were trying to log on from the application. I ignored the errors and moved on.

5. Execute PsExec – you may need to adjust the quotation marks and file locations

Run cmd as administrator

PsExec -s -i “C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Ssms.exe”

The above command will launch SQL Server Management Studio and give you a “Connect to Server” window with the User Name pre-populated as NT AUTHORITY\SYSTEM

6. Click Connect, then go into Security > Logins and add your account as a sysadmin

7. Restart the SQL Server services

2018’s list of goals

Many items pique my interest, and that can sometimes make it difficult to choose a few that I would like to gain a deeper understanding of, but I think my current role will guide me in 2018.

In no particular order:

SQL Server on Linux: This is a no-brainer. For as long as I can remember, I have preferred working with the command line over GUI-based tools. I’ve always felt like I have more control over what I am doing. And now that SQL Server is on Linux, I want to dig in and learn whatever I can. I also like the idea that there aren’t a lot of SQL Server DBAs who are familiar with Linux.

Cloud: I’ve played around in the cloud for years, but with little reason. Between Azure options and the Oracle Cloud, it’s time to start looking at the differences and see where each can help me.

Power BI: With Power BI Premium, I think this is a game changer for Corporate adoption.

Make it to PASS Summit: I’ve been to Oracle Open World 3 times, but due to job responsibilities or other co-workers’ training needs, I have unfortunately never made it to PASS. I don’t mind Open World, but more and more I can see PASS being more beneficial.

Oracle Open World and PASS Summit

A little less than a month ago I attended Oracle Open World in San Francisco. It’s been a few years since I last attended the conference. Although San Francisco seems to have changed, the conference has stayed the same: numerous days of very informative sessions. As I felt the last time I attended, many of the sessions seem like marketing sessions; this isn’t a bad thing, just different. I did really enjoy some sessions dealing with Oracle EBS, Oracle Cloud and DBA topics. One session, “Navigating your DBA Career in the Oracle Cloud” by Craig Shallahamer from orapub.com, was very eye-opening. Enough so that I have started to look more closely at my future.

PASS Summit is going on right now and unfortunately, I am not in attendance. However, thanks to PASSTV, I have been able to watch numerous days worth of keynotes and sessions. This is definitely a conference that I would like to attend in the future. I know a good amount of people that attend every year and love this conference.

Combination of Both Worlds

I just recently came back to my blog and noticed that my last post stated that I was going back to Oracle. I did, but the fit with that employer was not the best, and I moved on to another company at the beginning of 2016. I couldn’t be happier with that move. I get to work with SQL Server and Oracle along with other software packages. Some items that I am working on or researching, and would like to blog about in the future: Automic scheduling software, SQL Server on Linux, Azure, Azure Analysis Services, Business Intelligence, Power BI, Oracle Data Guard, Oracle Cloud and various other items.

Going back to Oracle

I am thrilled to say that after 4 years of working with Microsoft SQL Server and the BI Tool Stack, I am going back to working with Oracle. I enjoyed working with Microsoft and getting to add another tool in my tool belt, but I am very excited about this new opportunity.

Hopefully this opportunity will lend itself to more blog posts.

Stay tuned for more details.