Proxmedia – https://pxm-software.com

SQL Server Database Index optimization – maintenance plan
https://pxm-software.com/sql-server-database-index-optimization-maintenance-plan/ (Wed, 05 Aug 2015)

Properly optimized table indexes are crucial in database performance tuning. This is especially true when the database grows over time, causing indexes to become fragmented. The more fragmented an index is, the longer database reads take.

When should indexes be rebuilt? Microsoft recommends reorganizing indexes when the fragmentation level is between 5% and 30%. When fragmentation is above 30%, the index should be rebuilt.
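These thresholds can be sketched as a small decision helper (a hypothetical Python sketch mirroring the guidance above; the function and parameter names are illustrative, not part of SQL Server):

```python
# Hypothetical helper mirroring the guidance used in this article:
# reorganize between 5% and 30% fragmentation, rebuild above 30%.
def index_action(avg_fragmentation_in_percent,
                 reorganize_min=5, rebuild_min=30):
    """Return the recommended maintenance action for an index."""
    if avg_fragmentation_in_percent > rebuild_min:
        return "REBUILD"
    if avg_fragmentation_in_percent > reorganize_min:
        return "REORGANIZE"
    return "NONE"

print(index_action(3))    # NONE
print(index_action(17))   # REORGANIZE
print(index_action(45))   # REBUILD
```

The same comparisons appear in the stored procedure below as `@IndexFragmentationMin` and `@IndexFragmentationRebuildMin`.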

The best way of ensuring low fragmentation levels is to set up a maintenance plan for the database that is triggered periodically (preferably at night). The script below can be used in a scheduled job. You can use the @PrintInformationOnly param to print the index information first, for all database tables. When you set it to 0, the actual rebuild or reorganize process will start. I have also added @ExcludeTableList to exclude tables that should be skipped (for example because they are too big or not used).

DECLARE	@return_value int

EXEC	@return_value = [dbo].[IndexOptimize]
		@databaseName = N'myDatabase',
		@IndexFragmentationMin = 5,
		@IndexFragmentationRebuildMin = 30,
		@PrintInformationOnly = 1

SELECT	'Return Value' = @return_value

The actual script starts here. You may want to adjust the FILLFACTOR param separately for each table, as it should be configured depending on data update frequency. By setting the MAXDOP = 1 param, we disable parallel threads during the rebuild process, so the server CPU will not go too high.

I have successfully tested this script on a database of about 150 GB. Be aware that the initial index rebuild may take a few hours, depending on fragmentation levels.

 
CREATE PROCEDURE [dbo].[IndexOptimize] 
@databaseName varchar(200) = 'MyDatabase',
@IndexFragmentationMin int = 5,
@IndexFragmentationRebuildMin int = 30,
@skipLargeTables bit = 0,
@ExcludeTableList nvarchar(max) = 'MyBigTable1,MyBigTable2',
@PrintInformationOnly bit = 0
AS
BEGIN

BEGIN TRY
	declare @dbid int = 0;
	set @dbid = (select [dbid] from [Master].dbo.[sysdatabases] where name = @databaseName);
    ------------------
	DECLARE @Schema VARCHAR(200);  
	DECLARE @Table NVARCHAR(200);  
	DECLARE @Index VARCHAR(200);  
	DECLARE @avg_fragmentation_in_percent int; 
	DECLARE @page_count int; 

	DECLARE db_cursor CURSOR FOR  
		SELECT 
		dbschemas.[name] as 'Schema', 
		dbtables.[name] as 'Table', 
		dbindexes.[name] as 'Index',
		indexstats.avg_fragmentation_in_percent,
		indexstats.page_count
		FROM sys.dm_db_index_physical_stats (@dbid, NULL, NULL, NULL, NULL) AS indexstats
		INNER JOIN sys.tables dbtables on dbtables.[object_id] = indexstats.[object_id]
		INNER JOIN sys.schemas dbschemas on dbtables.[schema_id] = dbschemas.[schema_id]
		INNER JOIN sys.indexes AS dbindexes ON dbindexes.[object_id] = indexstats.[object_id]
		AND indexstats.index_id = dbindexes.index_id
		WHERE dbindexes.[name] is not null and indexstats.database_id = @dbid  
		ORDER BY indexstats.avg_fragmentation_in_percent desc

	OPEN db_cursor   
	FETCH NEXT FROM db_cursor INTO @Schema, @Table, @Index, @avg_fragmentation_in_percent, @page_count 

	WHILE @@FETCH_STATUS = 0   
	BEGIN  
	   if(@skipLargeTables = 1 and CHARINDEX(@Table, @ExcludeTableList) > 0)
	   begin
		 print 'skipped index (' + CONVERT(varchar(10),@avg_fragmentation_in_percent) + '%) ' + @Index + ' for table ' + @Table + ', page count ' + CONVERT(varchar(10),@page_count)
	   end
	   else
	   begin
			if(@avg_fragmentation_in_percent <= @IndexFragmentationRebuildMin and @avg_fragmentation_in_percent > @IndexFragmentationMin and LEN(@Index) > 3)
			begin
			   print 'reorganizing index (' + CONVERT(varchar(10),@avg_fragmentation_in_percent) + '%) ' + @Index + ' for table ' + @Table + ', page count ' + CONVERT(varchar(10),@page_count)
			  
			   if(@PrintInformationOnly = 0)
			   begin
			      exec('ALTER INDEX [' + @Index + '] ON [' + @Schema + '].[' + @Table + '] REORGANIZE;');
			   end
			end
		    
			if(@avg_fragmentation_in_percent > @IndexFragmentationRebuildMin and LEN(@Index) > 3)
			begin
			   print 'rebuilding index (' + CONVERT(varchar(10),@avg_fragmentation_in_percent) + '%) ' + @Index + ' for table ' + @Table + ', page count ' + CONVERT(varchar(10),@page_count)
			  
			   if(@PrintInformationOnly = 0)
			   begin
			      exec('ALTER INDEX [' + @Index + '] ON [' + @Schema + '].[' + @Table + '] REBUILD WITH (MAXDOP = 1, FILLFACTOR = 90);');
			   end
			end
		end
			
		FETCH NEXT FROM db_cursor INTO @Schema, @Table, @Index, @avg_fragmentation_in_percent, @page_count  
	END   

	CLOSE db_cursor   
	DEALLOCATE db_cursor
	
	END TRY 
BEGIN CATCH	
	DECLARE @ErrorMessage NVARCHAR(4000);
	DECLARE @ErrorSeverity INT;
	DECLARE @ErrorState INT;
	SELECT @ErrorMessage = 'Message: '+ ERROR_MESSAGE() + CHAR(13) + CHAR(10) + 
						   'Procedure: ' + CAST(ERROR_PROCEDURE() AS VARCHAR(MAX)) + CHAR(13) + CHAR(10) +  
						   'Line: ' + CAST(ERROR_LINE() AS VARCHAR(MAX)) + CHAR(13) + CHAR(10) +  
						   'Number: ' + CAST(ERROR_NUMBER() AS VARCHAR(MAX)),  @ErrorSeverity = ERROR_SEVERITY(), @ErrorState = ERROR_STATE();	

	RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);
	RETURN 0
END CATCH

END

Using Power-Shell in automated tasks (batch script alternative)
https://pxm-software.com/using-power-shell-in-automated-tasks-batch-script-alternative/ (Sun, 28 Jun 2015)

PowerShell is Microsoft's automation framework sitting on top of .NET (System.Management.Automation) that lets us perform advanced automated tasks. It can help not only administrators but also programmers save a lot of time by automating repeatable processes. The script syntax is quite intuitive and easy to learn, so users already writing e.g. batch scripts can grasp it very fast.

If we have, for example, an old batch file that executes SQL scripts stored in a folder, as follows:

 @ECHO OFF
 echo script folder: %1
 echo server: %2
 echo database: %3
 echo database user: %4
 echo ----------------

setlocal enabledelayedexpansion
cd /d %1

for %%d in (*.sql) do (
   set filecontent=
   for /f "delims=" %%a in (%%d) do (set "filecontent=!filecontent!%%a")
   set "temp=!filecontent!"
   set "filecontent=!temp:#PARAM#=%4!"
   echo executing %%d
   echo ----------
   sqlcmd -S%2 -E -Q"!filecontent!" -d%3 
)
pause

then the above script can easily be replaced with PowerShell. Using PowerShell gives us the same flexibility as other scripting languages and at the same time allows us to interact with the .NET framework when required.

Before we can start executing PowerShell scripts on our computer, we need to enable script execution, which is disabled by default. We can do it using the following cmdlets:

 # enable script execution
 Set-ExecutionPolicy RemoteSigned

 # get back to the default settings
 Set-ExecutionPolicy Default

After we’ve created a test.ps1 file, we can start the implementation. In order to make testing easier, we can define default params inside the script using the param(…) block. Please note that by using the [string] keyword we are actually creating strongly typed objects. The defined parameters will be overwritten by the user’s passed-in params.


The first command (dir $ScriptFolder) just lists all files within the directory, then the ForEach-Object function iterates through the files, reading each file’s content into the $query variable. Next, the placeholder in the $query text is replaced with our parameter if present. The final task is just to execute the SQLCMD command, passing in the query and database connection settings.

param (
[string]$ScriptFolder = "Database\Procedures",
[string]$Server = ".\SQL2008R2",
[string]$Database = "MyDb",
[string]$DatabaseUser = "user"
)


(dir $ScriptFolder) | 
  ForEach-Object { 
    $query = get-content $_.FullName -Raw
    $query = $query.Replace("#PARAM#", $DatabaseUser);
    SQLCMD -S $Server -E -Q $query -d $Database
   }

In order to run our PowerShell script, we just need to execute the following line:

 REM command line
 powershell -noexit C:\test.ps1 -ScriptFolder "C:\Database\Procedures" -Server ".\SQL2008R2" -Database "MyDB" -DatabaseUser "user"

Good luck with your PowerShell scripting 🙂

Deploying website files to multiple folders with exclusions
https://pxm-software.com/deploying-website-files-to-multiple-folders-with-exclusions/ (Tue, 26 May 2015)

When automating deployments, it is sometimes necessary to create copies of your website/folder in different locations with some files excluded, e.g. configuration files or temp folders.

The script below will help you achieve that. It can be triggered automatically by your post-deployment action (possibly on your CI server) or executed manually. The script simply copies all files from the source folder into multiple locations, except for the files and sub-folders you specify. Files with the same names will be overwritten in the target locations.
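The same copy-with-exclusions idea can be sketched in a few lines of Python (a hypothetical sketch for illustration; the actual script below is PowerShell, and the excluded names are the ones used there):

```python
# Copy a source tree to several targets while skipping named files/folders.
import shutil

# shutil.ignore_patterns matches both file and directory names.
EXCLUDE = shutil.ignore_patterns(
    "Web.config", "Web.Debug.config", "Web.Release.config",
    "TempFiles", "TempImages", "obj",
)

def deploy(source, *targets):
    for target in targets:
        # dirs_exist_ok=True (Python 3.8+) overwrites files with the
        # same names in the target, like the PowerShell -Force copy.
        shutil.copytree(source, target, ignore=EXCLUDE, dirs_exist_ok=True)
```

Usage would be `deploy(r"c:\inetpub\wwwroot\sourceApp", r"...\copy1App", r"...\copy2App")`.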


param (
##main paths
[string]$Source_wwwroot = "c:\inetpub\wwwroot\sourceApp",
[string]$Copy1_wwwroot = "c:\inetpub\wwwroot\copy1App",
[string]$Copy2_wwwroot = "c:\inetpub\wwwroot\copy2App",
##copy exceptions
$exclude = @("Web.config","Web.Debug.config","Web.Release.config"),
$excludeMatch = @("TempFiles","TempImages","obj")
)

if(-not(Test-Path $Source_wwwroot)) { "Source folder not found!"; return;}
if(-not(Test-Path $Copy1_wwwroot)) { "Copy1 folder not found!"; return;}
if(-not(Test-Path $Copy2_wwwroot)) { "Copy2 folder not found!"; return;}

If (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{ 
  "You need to run this script as admin!";
  sleep 3
  return;
}

Write-Host "`n#######################`nThis script will copy files from: `n$Source_wwwroot `nto: `n$Copy1_wwwroot and`n$Copy2_wwwroot `n";

Write-Host "except files: $exclude";
Write-Host "except folders: $excludeMatch";

Read-host -prompt "`n#######################`nPress enter to start copying the files"

function CopyFiles($from, $to){
    [regex] $excludeMatchRegEx = '(?i)' + (($excludeMatch | foreach { [regex]::Escape($_) }) -join '|')
    Get-ChildItem -Path $from -Recurse -Exclude $exclude | 
     where { $excludeMatch -eq $null -or $_.FullName.Replace($from, "") -notmatch $excludeMatchRegEx} |
     Copy-Item -Destination {
      if ($_.PSIsContainer) {
       Join-Path $to $_.Parent.FullName.Substring($from.length)
      } else {
       Join-Path $to $_.FullName.Substring($from.length)
      }
     } -Force -Exclude $exclude
 }

 "copying files for copy1..."
 CopyFiles $Source_wwwroot $Copy1_wwwroot

 "copying files for copy2..."
 CopyFiles $Source_wwwroot $Copy2_wwwroot

"Done!";
sleep 10

Enjoy!
Live database restoration to test server using Powershell
https://pxm-software.com/live-database-restoration-to-test-server-using-powershell/ (Tue, 24 Mar 2015)

It is a very common requirement during the development stage to copy the live database to a test server on a daily/weekly basis. In such a scenario there is also very often a need to clean the database and perform some operations before restoring it in the test environment.

For example, if the live database is very big and you don’t need all the tables on your test server, you can drop them during the copy process.
The same applies if you need to adjust permissions, change the database name, optimize indexes or just shrink the database file.


The script below will help you automate the copy-database process. On every run, it will check for the latest database backup file in the defined location.
Next, it will copy that file to your local temp folder. After that, it will restore the database under a temporary name, so you can continue working while the restoration lasts (useful for big databases).
When the database is restored, the script will drop the old test database and rename the new one to the same name. The old database files will also be deleted to free up disk space. The database files will have names like “TestDB_DEV_14_08_19.mdf”.

As a result, you will have an up-to-date copy of the live database restored on your test server.
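The dated naming scheme that makes this swap-and-cleanup safe can be sketched in Python (hypothetical helper names for illustration; the actual script below is PowerShell and uses the same yy_MM_dd suffix):

```python
# The restored database gets a date-suffixed temporary name, so files from
# previous runs can be identified and removed after the rename swap.
from datetime import date

def temp_db_name(base, on=None):
    """Build the temporary database name, e.g. TestDB_DEV_14_08_19."""
    on = on or date.today()
    return "{}_{}".format(base, on.strftime("%y_%m_%d"))

def is_stale(filename, base, today_part):
    """A data file is stale if it belongs to this database family
    but does not carry today's date suffix."""
    name = filename.lower()
    return base.lower() in name and today_part not in name

print(temp_db_name("TestDB_DEV", date(2014, 8, 19)))  # TestDB_DEV_14_08_19
```

This mirrors the `(get-date).tostring("yy_MM_dd")` suffix and the old-file cleanup loop at the end of the script.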

 param (
#local test datatabase server to restore your live database 
[string]$TargetSQLServer= ".\MSSQLSERVER_2012",

#name of your live database
[string]$LiveSQLDatabase= "TestDB",

#name of your final test database being set after restoration
[string]$TargetSQLDatabase= "TestDB_DEV",

#data root of mssql server - containing .mdf files of your test database
[string]$TargetSQLDatabaseDataRoot= "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\DATA",

#folder where current backup file will be copied and processed - must be local folder
[string]$TempLocalFolder= "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\Backup_temp",

#backup location of your live database - can be located on the network drive
[string]$BackupLiveFolder = "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\Backup",

#show copy progress or copy in background
[bool]$silentCopying = 0,

#log location
[string]$Logfile = "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\Log\CopyLiveDBToTestLog.txt"
)

Function LogWrite
{
   Param ([string]$logstring)
   $currentdate = get-date -format "M-d-yyyy HH:mm:ss"
   Add-content $Logfile -value "$currentdate | $logstring"
   write-host "$currentdate | $logstring"
}

try
{ 
    LogWrite "acquiring backup file..."
    
    #get most recent backup file
    $liveBackupFile  = (dir -Path $BackupLiveFolder) | where {$_.extension -eq ".bak" }  | 
       sort lastwritetime -Descending | select -first 1;
      
    LogWrite "validating paths..."
    
    if($liveBackupFile -eq $null -or -not (Test-path $liveBackupFile.FullName)){ 
       LogWrite "Could not find the backup file, check the backup folder location!";
       return;}

    if(-not(Test-path $TempLocalFolder)) { LogWrite "Incorrect local backup location!"; return;}

    #copy backup file to temp location
    if(-not(Test-Path "$TempLocalFolder\$liveBackupFile"))
    {
       #remove all backups in temp location   
        dir -path $TempLocalFolder |  Foreach-Object {
        get-childitem -Path $_.FullName |  
         where-object { 
            ($_.FullName.tolower().EndsWith(".bak") -and $_.FullName.tolower().contains($LiveSQLDatabase.ToLower()))
          } | 
         Foreach-Object {
            LogWrite "removing file: $_.FullName"
            remove-item -path $_.FullName
          } 
        }

        LogWrite "copying backup file $liveBackupFile ..."
        
        if(-not($silentCopying -eq 1))
        {
            #use shell to see progress
            $FOF_CREATEPROGRESSDLG = "&H0&"
            $objShell = New-Object -ComObject "Shell.Application"
            $objFolder = $objShell.NameSpace($TempLocalFolder) 
            $objFolder.CopyHere($liveBackupFile.FullName, $FOF_CREATEPROGRESSDLG)
        }
        else
        {
           copy-item -Path $liveBackupFile.FullName -Destination "$TempLocalFolder\$liveBackupFile" -PassThru -Verbose
        }

        LogWrite "copying backup file $liveBackupFile finished"
    }
    else
    {
       LogWrite "backup file already exists: $TempLocalFolder\$liveBackupFile"
    }

    #----------------------------------------------------------

    LogWrite "connecting to $TargetSQLServer"

    $TargetSQLDatabase_temp = $TargetSQLDatabase + "_"+ (get-date).tostring("yy_MM_dd");
    
    $mdfCurrentPath = "$TargetSQLDatabaseDataRoot\$TargetSQLDatabase_temp.mdf"
    if(Test-Path $mdfCurrentPath)
    {
        LogWrite "database $TargetSQLDatabase_temp.mdf already exists!"
        return;
    }

    LogWrite "restoring database from file: $TempLocalFolder\$liveBackupFile to temp database $TargetSQLDatabase_temp ..."

    $DBLogicalFileName = $LiveSQLDatabase;
    $DBLogicalFileName_log = $LiveSQLDatabase + "_Log";

    $query = "RESTORE DATABASE [$TargetSQLDatabase_temp] FROM   
    DISK = N'$TempLocalFolder\$liveBackupFile' WITH  FILE = 1,  
    MOVE N'$DBLogicalFileName' TO N'$TargetSQLDatabaseDataRoot\"+$TargetSQLDatabase_temp+".mdf',   
    MOVE N'$DBLogicalFileName_log' TO N'$TargetSQLDatabaseDataRoot\"+$TargetSQLDatabase_temp+".ldf', 
    NORECOVERY,  NOUNLOAD,  REPLACE,  STATS = 10";


    SQLCMD -S $TargetSQLServer -E -Q $query 

    Start-Sleep -s 10

    #bring it to live from restoring mode
    $query = "RESTORE DATABASE "+$TargetSQLDatabase_temp+" WITH RECOVERY"
    SQLCMD -S $TargetSQLServer -E -Q $query 
    
    LogWrite "dropping old $TargetSQLDatabase database..."

    #drop current database
    $query = "ALTER DATABASE [$TargetSQLDatabase]
    SET OFFLINE WITH ROLLBACK IMMEDIATE
    DROP DATABASE [$TargetSQLDatabase]"
    SQLCMD -S $TargetSQLServer -E -Q $query 
    
    Start-Sleep -s 10
    
    LogWrite "renaming temp database $TargetSQLDatabase_temp to $TargetSQLDatabase ..."

    #rename temp database 
    $query = "ALTER DATABASE ["+$TargetSQLDatabase_temp+"] SET SINGLE_USER WITH ROLLBACK IMMEDIATE    
    ALTER DATABASE ["+$TargetSQLDatabase_temp+"] MODIFY NAME = [$TargetSQLDatabase]
    ALTER DATABASE [$TargetSQLDatabase] SET MULTI_USER"
    SQLCMD -S $TargetSQLServer -E -Q $query 

    LogWrite "recreating db permissions..."
    
    #recreate permissions
    $query = "ALTER AUTHORIZATION ON DATABASE::[$TargetSQLDatabase] TO [sa]
    use  [$TargetSQLDatabase]
    ALTER User [domain\Administrators] with LOGIN = [domain\Administrators]"

    #enable if used
    #SQLCMD -S $TargetSQLServer -E -Q $query 
           
    LogWrite "cleaning and shrinking database $TargetSQLDatabase ..."
    
    $query = "use $TargetSQLDatabase;
    ALTER DATABASE $TargetSQLDatabase SET RECOVERY SIMPLE;   
    truncate table dbo.TestTable;

    DBCC SHRINKFILE('$DBLogicalFileName_Log');
    DBCC SHRINKFILE('$DBLogicalFileName')"
    SQLCMD -S $TargetSQLServer -E -Q $query 
    
    
    LogWrite "optimizing table indexes..."
    
     $query = "USE [$TargetSQLDatabase];
        
        EXEC [dbo].[IndexOptimize]
		@IndexFragmentationMin  = 5,
        @IndexFragmentationRebuildMin  = 30,
		@skiplargetables = 0,
		@PrintInformationOnly = 0"
 
    #enable if used
    #SQLCMD -S $TargetSQLServer -E -Q $query 
    
    
    LogWrite "removing old database files..."
       
    #remove all old database files   
    $fileEntries = [IO.Directory]::GetFiles("$TargetSQLDatabaseDataRoot");
    $todayDatePart = (get-date).tostring("yy_MM_dd");

     foreach($fileName in $fileEntries) 
     { 
        if($fileName.tolower().contains("$TargetSQLDatabase".tolower()) -and -not($fileName.tolower().contains($todayDatePart)))
        {
           LogWrite $fileName
           #remove-item -path $fileName
        }
     }    
      
    LogWrite "Done!"

} 
catch { 
     $msg = "Unexpected Error. Error details: $_.Exception.Message"
     LogWrite $msg 
     Write-Host $msg -foreground Red 
}

Enjoy!
Terminate background stored procedure session
https://pxm-software.com/terminate-background-stored-procedure-session/ (Thu, 05 Mar 2015)

If you are executing a background stored procedure, for example one run asynchronously by the Service Broker, there is often a need to stop the process, especially when it is a time-consuming operation. In order to do that, we have to find the session id of the process we want and then use the KILL command to terminate it.

In order to display the currently running queries, let’s execute the following:

 SELECT sqltext.TEXT as SQLText,
	req.session_id,
	req.database_id,
	req.status,
	req.command,
	req.cpu_time,
	req.total_elapsed_time
	FROM sys.dm_exec_requests req
	CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext

Now that we’ve seen how to display running queries, we can create a stored procedure that kills the sessions related to the procedure name we specify.

 CREATE PROCEDURE [dbo].[proc_com_KillBackgroundProcedure] (
 @DatabaseName nvarchar(50) = '',
 @ProcedureName nvarchar(50) = ''
)
AS

declare @SessionToBeKilled int = null;
declare @ProcessName nvarchar(max) = '';

declare dataset cursor for

select SQLText, session_id
from (SELECT sqltext.TEXT as SQLText,
	req.session_id,
	req.database_id,
	req.status,
	req.command,
	req.cpu_time,
	req.total_elapsed_time
	FROM sys.dm_exec_requests req
	CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext
	inner join master.dbo.sysdatabases m on m.dbid = database_id
	where m.name = @databasename and req.status = 'background' 
        and [TEXT] like '%' + @ProcedureName + '%'
	) as T

open dataset
fetch next from dataset into  @ProcessName, @SessionToBeKilled
while @@fetch_status = 0
begin
 
  EXEC('KILL ' + @SessionToBeKilled);
  print 'killed: ' + convert(varchar(max),@SessionToBeKilled);

fetch next from dataset into  @ProcessName, @SessionToBeKilled
end
close dataset
deallocate dataset

The above procedure simply gets the session id by the procedure name and then kills the process. Be careful not to kill a system query. I hope this has been useful for you 🙂
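The selection logic used in the cursor above can be sketched outside SQL as a simple filter (a hypothetical Python sketch; the dictionary keys mirror the sys.dm_exec_requests columns used in the query):

```python
# From a list of running requests, pick background sessions whose SQL text
# mentions the given procedure name - the candidates for KILL.
def sessions_to_kill(requests, database, procedure_name):
    """requests: iterable of dicts with 'session_id', 'database',
    'status' and 'sql_text' keys."""
    return [r["session_id"] for r in requests
            if r["database"] == database
            and r["status"] == "background"
            and procedure_name.lower() in r["sql_text"].lower()]

requests = [
    {"session_id": 51, "database": "MyDb", "status": "background",
     "sql_text": "exec dbo.proc_HeavyCalc"},
    {"session_id": 52, "database": "MyDb", "status": "running",
     "sql_text": "exec dbo.proc_HeavyCalc"},
]
print(sessions_to_kill(requests, "MyDb", "proc_HeavyCalc"))  # [51]
```

Only session 51 qualifies: session 52 matches the text but is not a background request, just like the `req.status = 'background'` filter in the SQL above.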

Using mssql Service Broker in Service-Oriented Architecture
https://pxm-software.com/using-mssql-service-broker-in-service-oriented-architecture/ (Mon, 02 Mar 2015)

The main principle of SOA is the separation between a service interface and its implementation. In order to achieve that separation on the database level, we can use MSSQL Service Broker messaging.

Using Service Broker allows us to send internal or external messages between or within MSSQL databases. We can also use the messages to activate a stored procedure and execute tasks, or even call an external web service by invoking a custom C# assembly method. Internal messaging lets us trigger asynchronous events that we can use to perform time-consuming operations or integration tasks (for example, a database trigger activation that starts the event chain).

In this example I will show you how to execute a time-consuming task asynchronously. It is especially useful when you need to recalculate a lot of data whenever some table content changes (trigger events).

Below is an example of triggering a time-consuming procedure synchronously and asynchronously using Service Broker.
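The queue-plus-activated-worker pattern shown below has a direct analogy in most languages. As a minimal Python sketch (names are illustrative and not part of any SQL Server API): the caller only enqueues a message and returns, while a separate worker drains the queue and runs the named task.

```python
# Queue + activated worker, analogous to Service Broker activation:
# execute_async() plays the role of proc_ExecuteProcedureAsync, and
# activation_worker() plays the role of the activation procedure.
import queue
import threading

task_queue = queue.Queue()
results = []

def activation_worker():
    # Like the activation procedure: receive messages until the queue
    # stays empty for the timeout, executing the named task for each one.
    while True:
        try:
            target = task_queue.get(timeout=1)
        except queue.Empty:
            break
        results.append("executed " + target)
        task_queue.task_done()

def execute_async(procedure_name):
    # Like proc_ExecuteProcedureAsync: enqueue the name and return at once.
    task_queue.put(procedure_name)

execute_async("proc_RecalculateTotals")
worker = threading.Thread(target=activation_worker)
worker.start()
worker.join()
```

In Service Broker, the queue is durable and transactional and the "worker" is the activation stored procedure started by the engine itself, which is what the rest of this article sets up.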


In order to implement our solution, we first need to create a self-activating stored procedure that will be activated by the Service Broker queue each time we send a message to it.

 CREATE PROCEDURE proc_BrokerTargetActivProc
 AS
  DECLARE @RecvReqDlgHandle UNIQUEIDENTIFIER;
  DECLARE @RecvReqMsg NVARCHAR(100);
  DECLARE @RecvReqMsgName sysname;

  WHILE (1=1)
  BEGIN

    BEGIN TRANSACTION;

    WAITFOR
    ( RECEIVE TOP(1)
        @RecvReqDlgHandle = conversation_handle,
        @RecvReqMsg = message_body,
        @RecvReqMsgName = message_type_name
      FROM TASK_QUEUE
    ), TIMEOUT 5000;

    IF (@@ROWCOUNT = 0)
    BEGIN
      ROLLBACK TRANSACTION;
      BREAK;
    END
     
    declare @targetProcedure nvarchar(100);
    set @targetProcedure = CAST(@RecvReqMsg AS nvarchar(100))
    
    IF LEN(@targetProcedure) > 0
    BEGIN
        --execute target procedure (no need to send the reply)
       exec @targetProcedure;
     
    END
    ELSE IF @RecvReqMsgName =
        N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
    BEGIN
       END CONVERSATION @RecvReqDlgHandle;
    END
    ELSE IF @RecvReqMsgName =
        N'http://schemas.microsoft.com/SQL/ServiceBroker/Error'
    BEGIN
       END CONVERSATION @RecvReqDlgHandle;
    END
      
    COMMIT TRANSACTION;

  END
GO

When we have the activation procedure ready, the next step is to create the basic Service Broker objects we will use when sending messages.

CREATE MESSAGE TYPE [//SBM/RequestMessage] VALIDATION=WELL_FORMED_XML; 
CREATE MESSAGE TYPE [//SBM/ReplyMessage] VALIDATION=WELL_FORMED_XML;

CREATE CONTRACT [//SBM/MSGContract]
(
 [//SBM/RequestMessage]
 SENT BY INITIATOR 
  ,[//SBM/ReplyMessage]
 SENT BY TARGET 
);  

CREATE QUEUE TASK_QUEUE
 WITH STATUS=ON,
 ACTIVATION (
  PROCEDURE_NAME = proc_BrokerTargetActivProc,
  MAX_QUEUE_READERS = 5,
  EXECUTE AS SELF);

CREATE SERVICE [//SBM/TargetService] ON QUEUE TASK_QUEUE([//SBM/MSGContract]);
CREATE SERVICE [//SBM/InitService] ON QUEUE TASK_QUEUE([//SBM/MSGContract]);

The final step is to create a helper procedure that sends activating messages to the queue, passing in the name of the target procedure to be executed.

CREATE PROCEDURE proc_ExecuteProcedureAsync
 @ProcedureName nvarchar(100) = ''
 AS

DECLARE @InitDlgHandle UNIQUEIDENTIFIER;
DECLARE @RequestMsg xml;
BEGIN TRANSACTION;
BEGIN DIALOG @InitDlgHandle
     FROM SERVICE
      [//SBM/InitService]
     TO SERVICE
      N'//SBM/TargetService'
     ON CONTRACT
      [//SBM/MSGContract]
     WITH
         ENCRYPTION = OFF;
 
 SELECT @RequestMsg = @ProcedureName;
       
--Send the Message
SEND ON CONVERSATION @InitDlgHandle MESSAGE TYPE [//SBM/RequestMessage](@RequestMsg);

--Close conversation
END CONVERSATION @InitDlgHandle;

COMMIT TRANSACTION;

The SQL snippets above create all the structures and procedures for you. Give it a try.

Using automation when executing SQL scripts stored in separate files
https://pxm-software.com/using-automation-when-executing-sql-scripts-stored-in-separate-files/ (Mon, 02 Mar 2015)

When working on big projects, there is sometimes a need to update a database from SQL files scattered around multiple files or folders, especially when using source control to maintain database changes. This functionality is also useful when running automated builds via a CI server.

In this article I will show you how to automate the process of applying multiple database SQL files, so you can do it in one go, like this:

  REM command line executed by the TeamCity build (change params with your build configuration values)
 c:\\UpdateScripts.exe "\scripts\Procedures" %Server% %Database% %ToBeReplacedSelectUser% %DatabaseSelectUser% "exec dbo.proc_GrantExecutePermissionsOnProcedures '%DatabaseUser%'"


Let’s create a C# console application first. Once that is done, we can implement basic validation rules for the parameters being passed in. In our example we will support six parameters: ScriptFolder, Server, Database, StringToFind, StringToReplace and FinalQuery (we will use a Windows integrated database connection).

    if (args.Length == 0)
    {
        Console.WriteLine("No arguments passed!");
        Console.WriteLine(Environment.NewLine);
        Console.WriteLine(getHelp());
        Console.ReadLine();
        return 1;
    }

    if (args.Length < 5)
    {
        if (!silent)
        {
            Console.WriteLine("Not all required arguments passed!");
            Console.WriteLine(Environment.NewLine);
            Console.WriteLine(getHelp());
            Console.ReadLine();
        }
        return 1;
    } 
    ///////////////
    var scriptPath = args[0];
    var server = args[1];
    var database = args[2];
    var findString = args[3];
    var replaceString = args[4];
    var endQuery = args.Length >= 6 ? args[5] : string.Empty;
    /////////////

    if (!Directory.Exists(scriptPath) || Directory.GetFiles(scriptPath).Length == 0)
    {
        if (!silent)
        {
            Console.WriteLine("Directory does not exist or is empty!");
            Console.WriteLine(Environment.NewLine);
            Console.WriteLine(getHelp());
            Console.ReadLine();
        }
        return 1;
    }

The code below will get all .sql files from the directory and execute the queries they contain. If a file has “GO” statements inside, it cannot be sent to the server as a single command without errors. Normally, SQL Server Management Studio takes care of that when executing such scripts. In our case we simply need to split the content into “GO” batches ourselves.
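The batch-splitting idea can also be expressed with a regular expression that treats GO as a batch separator only when it stands alone on a line (a hedged Python sketch for illustration; the article's C# code below uses simple string splitting instead):

```python
# Split a T-SQL script into batches on lines containing only GO
# (case-insensitive), dropping empty batches.
import re

def split_batches(script):
    batches = re.split(r"(?im)^\s*GO\s*$", script)
    return [b.strip() for b in batches if b.strip()]

script = "CREATE TABLE T (Id int)\nGO\nINSERT INTO T VALUES (1)\ngo\n"
print(split_batches(script))
# ['CREATE TABLE T (Id int)', 'INSERT INTO T VALUES (1)']
```

Anchoring on whole lines avoids false splits when the letters “go” appear inside identifiers or string literals, which string-based splitting can get wrong.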

Console.WriteLine("params to find: " + findString);
Console.WriteLine("replace string: " + replaceString);

using (var con = new SqlConnection("Server=" + server + ";Database=" + database + ";Trusted_Connection=True;"))
{
    con.Open();

    var files = Directory.GetFiles(scriptPath).Where(f => f.ToLower().EndsWith(".sql"));

    foreach (var filePath in files)
    {
        #region execute query from file
        var query = File.ReadAllText(filePath);

        if (findString.Length > 0)
        {
            query = query.Replace(findString, replaceString);
        }
        Console.WriteLine("Executing: " + Path.GetFileName(filePath));

        // split the script into batches on lines containing only GO
        // (case-insensitive), as SSMS does, to avoid syntax errors;
        // requires: using System.Text.RegularExpressions;
        var itemQuery = Regex.Split(query, @"^\s*GO\s*$",
            RegexOptions.Multiline | RegexOptions.IgnoreCase);

        foreach (var queryItem in itemQuery)
        {
            if (queryItem.Trim().Length < 3) { continue; }

            using (var cmd = new SqlCommand(queryItem, con))
            {
                try
                {
                    cmd.ExecuteNonQuery();
                }
                catch (Exception ex)
                {
                    if (!silent)
                    {
                        Console.WriteLine("Error: " + ex.Message);
                    }
                    exitResult = 1;
                }
            }
        }
        #endregion
    }

At the end we can execute a final SQL statement. This is useful when, for example, database permissions need to be assigned after the update is performed.

// execute the final query (e.g. a permission update)
if (endQuery.Trim().Length > 3)
{
    using (var cmd = new SqlCommand(endQuery, con))
    {
        try
        {
            Console.WriteLine("Executing final query: " + endQuery);
            cmd.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
            return 1;
        }
    }
}

The Console.WriteLine output will also be picked up by the TeamCity build log, so you can see all update information when needed. I have attached the solution files, feel free to adjust them to your needs 🙂
[Attachment: UpdateDBScripts]

The article Using automation when executing SQL scripts stored in separate files originally appeared on Proxmedia.

]]>
MSSQL database backup/ restore helper scripts https://pxm-software.com/mssql-database-backup-restore-helper-scripts/ Wed, 11 Feb 2015 09:59:47 +0000 https://proxmedia.pl:99/en/?p=3896 When working with MSSQL server I have spotted some general problems during restoring databases. Below I’m presenting some common solutions that can save you development time. If you are restoring database backup taken from the same database but running on different server eg. restoring production database on your test server – you may get security […]

The article MSSQL database backup/ restore helper scripts originally appeared on Proxmedia.

]]>
When working with MSSQL Server I have spotted some common problems when restoring databases. Below I present solutions that can save you development time.

If you are restoring a database backup on a different server than the one it was taken from, e.g. restoring a production database on your test server, you may get a security error when trying to connect with the old DB user.
This is because the database user was created on a different server, where the login's security identifier was different, leaving the restored user orphaned. The solution is to remap the user on the restored database:

-- remap the database user to the matching server login
ALTER USER DB_user WITH LOGIN = DB_user;

-- set the login password
ALTER LOGIN DB_user WITH PASSWORD = 'secret';

If you must overwrite your test database with a different one, you first need to close all current database connections, otherwise you may get an error saying that the database is in use. The quick solution for that is the script below.
It simply sets the database offline and back online. This is sometimes much faster than waiting for all connections to be closed in other ways (note: use it only on a test server).

ALTER DATABASE [MyDB]
SET OFFLINE WITH ROLLBACK IMMEDIATE
ALTER DATABASE [MyDB]
SET ONLINE
GO

If you get the following error: “Cannot execute as the database principal because the principal “dbo” does not exist, this type of principal cannot be impersonated, or you do not have permission”, this is the solution:

 ALTER AUTHORIZATION ON DATABASE::[myDatabase] TO [sa];

If your database backup takes a lot of space even though you know there is not much data in it, you can simply shrink the log files before creating the backup.

Use MyDB
ALTER DATABASE MyDB
SET RECOVERY SIMPLE;
GO

DBCC SHRINKFILE ('MyDB_log',TRUNCATEONLY);
GO

ALTER DATABASE MyDB
SET RECOVERY FULL;

Finally, if you want to grant execute permissions on all stored procedures at once (on the restored database), you can use the following script:

 CREATE PROCEDURE [dbo].[procExecuteOnProcs] (
   @User varchar(max)
 )
 AS
  declare procs cursor for
  select routine_schema, routine_name from information_schema.routines where routine_type = N'PROCEDURE'

  declare @Schema nvarchar(128), @Name nvarchar(128), @SQL nvarchar(512)

  open procs
  fetch next from procs into @Schema, @Name
  while @@fetch_status = 0
  begin
    set @SQL = N'grant execute on [' + @Schema + N'].[' + @Name + N'] to ' + @User
    exec sp_executesql @SQL
    fetch next from procs into @Schema, @Name
  end
  close procs
  deallocate procs

Hope these simple tricks will help you. Please note that you can also automate the above scripts to be part of other processes.


]]>
Finding non empty data periods T-SQL https://pxm-software.com/finding-non-empty-data-periods-t-sql/ Thu, 15 Jan 2015 22:41:09 +0000 https://proxmedia.pl:99/en/?p=3771 Ever wanted to find non empty periods of data using T-SQL? I this article I will show you how to do it. Example below shows the periods of days in which there have been at least one order. The days without orders are not being shown. This is very useful when creating advanced sales reports. […]

The article Finding non empty data periods T-SQL originally appeared on Proxmedia.

]]>
Ever wanted to find non-empty periods of data using T-SQL? In this article I will show you how to do it.
The example below shows the periods of days in which there has been at least one order. The days without orders are not shown. This is very useful when creating advanced sales reports. Having the non-empty periods, we can calculate additional values such as the average sales in these ranges, etc.

The code below simply gets all distinct dates, then creates two partitions and joins them together.
In each partition the dates are joined with themselves using a left and a right join shifted by one day. The non-null values are then paired up in the final inner join.

 with temp(d) as 
 ( 
  select distinct CONVERT(date,OrderDate) d from tblOrders
 )
 select a1.id as id, a1.d as [from], a2.d as [to] from 
 ( 
   select b.d, ROW_NUMBER() over (partition by 1 order by b.d) id from temp a 
   right join temp b on dateadd(day,1,a.d) = b.d 
   where a.d is null 
 )
 a1 
 inner join  
 ( 
   select a.d, ROW_NUMBER() over (partition by 1 order by a.d) id from temp a 
   left join temp b on dateadd(day,1,a.d) = b.d 
   where b.d is null 
 ) 
 a2 on a1.id = a2.id 

[Image: sql-periods1]
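The start/end pairing performed by the two partitions can also be sketched procedurally. Below is a minimal Python illustration (not part of the original article) of the same idea: a date is a period start when the previous day has no orders, and a period end when the next day has none:

```python
from datetime import date, timedelta

def non_empty_periods(order_dates):
    """Pair period starts (no order the day before) with period
    ends (no order the day after), like the two partitions above."""
    day_set = set(order_dates)
    days = sorted(day_set)
    starts = [d for d in days if d - timedelta(days=1) not in day_set]
    ends = [d for d in days if d + timedelta(days=1) not in day_set]
    # starts and ends line up in order, like the inner join on id
    return list(zip(starts, ends))

periods = non_empty_periods([
    date(2015, 1, 1), date(2015, 1, 2), date(2015, 1, 5),
])
# periods: [(2015-01-01, 2015-01-02), (2015-01-05, 2015-01-05)]
```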

In order to calculate the sales data in the periods, simply join the final results using the found dates. Hope I helped you with this sample. Happy coding 🙂
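To make that final join step concrete, here is a small Python sketch (with made-up sample data, not from the article) of joining orders to the found periods to get the total sales per range, equivalent to a SQL join on OrderDate between [from] and [to]:

```python
from datetime import date

# hypothetical orders: (order_date, amount)
orders = [
    (date(2015, 1, 1), 100.0),
    (date(2015, 1, 2), 50.0),
    (date(2015, 1, 5), 75.0),
]

# periods as returned by the query: (from, to)
periods = [
    (date(2015, 1, 1), date(2015, 1, 2)),
    (date(2015, 1, 5), date(2015, 1, 5)),
]

# total per period, like SUM(amount) grouped by period with a BETWEEN join
totals = [
    sum(amount for d, amount in orders if start <= d <= end)
    for start, end in periods
]
# totals == [150.0, 75.0]
```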


]]>