Auto publishing reports to Tableau Server

Tableau is a great tool for data visualization; however, if you use it a lot, you may want to automate some tasks. One of them is publishing/updating reports to Tableau Server. This is where the Tableau command line utility (tabcmd) comes in handy: you can install it on your development server and use it in a PowerShell script. One solution is to use a CI server for auto-deployments; then you only need to give git/svn access to the users changing the reports or adding new ones.


The following script can be run as a build step in TeamCity to detect workbooks that have changed recently and publish them automatically to the Tableau server. The parent folder of each workbook is used as the project name when publishing. In order to run it, just pass in the email notification list and the server password; of course, you also need to configure the params (server URL, SMTP, etc.).

param (
[string]$cmddir = "C:\Program Files\Tableau\Tableau Server\8.2\extras\Command Line Utility", #location where tabcmd has been installed
[string]$server = "https://tableau:81", #this is url of the Tableau server 
[string]$currentDir = (split-path -parent $MyInvocation.MyCommand.Definition) +"\", #current script location
[string]$notificationEmailList = "test1@test.com,test2@test.com", #send email notifications if successful
[string]$admin = "user", #admin account for the server
[string]$pass = "" #to be passed in as param
)
 
function SendEmail($emailTo,$title,$body)
{ 
   $smtp=new-object Net.Mail.SmtpClient("my_smtp_server"); $smtp.Send("sentAs@mydomain.com", $emailTo, $title, $body);
}
 
$global:temp_ = "";
 
#login to Tableau
cd  $cmddir
.\tabcmd login -s $server -u $admin -p $pass
 
 get-childitem -Path $currentDir -Recurse |  where-object { 
    $_.LastWriteTime -gt (get-date).AddMinutes(-10) -and $_.FullName.EndsWith(".twb")
  } | 
  Foreach-Object {
 
       [string]$projectName = $_.Directory.Name; #parent folder name becomes the project name
        $global:temp_ += [string][System.IO.Path]::GetFileName($_.FullName) + " | ";
 
       #publish or overwrite workbook on the server
       .\tabcmd publish $_.FullName -r $projectName  -o  
  } 
 
 
#more commands
#.\tabcmd publish "workbook.twbx" -r "project name" -n "Workbook Name" --db-user "" --db-password "" -o
 
 
#log out to release the session
.\tabcmd logout
 
if($global:temp_ -ne "")
{
   SendEmail $notificationEmailList "Tableau report published" "The following report(s) have just been successfully published to Tableau Server: $global:temp_"
}
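
To wire this up as a TeamCity build step, a command-line runner along these lines should work (the script file name and the %tableau.admin.password% build parameter are illustrative, not part of the script above):

#hypothetical TeamCity command-line build step
powershell -NoProfile -ExecutionPolicy Bypass -File .\publish_workbooks.ps1 -pass "%tableau.admin.password%" -notificationEmailList "bi-team@test.com"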

Enjoy!


SQL Server Database Index optimization – maintenance plan

Properly optimized table indexes are crucial in database performance tuning. This is especially true when the database grows over time, causing indexes to become fragmented. The more fragmented an index is, the longer database reads take.

When should indexes be rebuilt? Microsoft recommends reorganizing indexes when the fragmentation level is between 5% and 30%; when fragmentation is above 30%, the index should be rebuilt.

The best way of ensuring low fragmentation levels is to set up a maintenance plan for the database that is triggered periodically (preferably at night). The script below can be used in a scheduled job. You can use the @PrintInformationOnly param to print the index information first, for all database tables; when you set it to 0, the actual rebuild or reorganize process starts. I have also added @ExcludeTableList to exclude tables that should be skipped (for example, ones that are too big or not used).

DECLARE	@return_value int

EXEC	@return_value = [dbo].[IndexOptimize]
		@databaseName = N'myDatabase',
		@IndexFragmentationMin = 5,
		@IndexFragmentationRebuildMin = 30,
		@PrintInformationOnly = 1

SELECT	'Return Value' = @return_value

The actual script starts here. You may want to adjust the FILLFACTOR param separately for each table, as this should be configured depending on data update frequency. By setting MAXDOP = 1, we disable parallel threads during the rebuild process, so the server CPU will not spike too high.

I have successfully tested this script on a database of about 150 GB. Be aware that the initial index rebuild may take a few hours, depending on fragmentation levels.

 
CREATE PROCEDURE [dbo].[IndexOptimize] 
@databaseName varchar(200) = 'MyDatabase',
@IndexFragmentationMin int = 5,
@IndexFragmentationRebuildMin int = 30,
@skipLargeTables bit = 0,
@ExcludeTableList nvarchar(max) = 'MyBigTable1,MyBigTable2',
@PrintInformationOnly bit = 0
AS
BEGIN

BEGIN TRY
	declare @dbid int = 0;
	set @dbid = DB_ID(@databaseName); --same lookup as querying sysdatabases, but simpler
    ------------------
	DECLARE @Schema VARCHAR(200);  
	DECLARE @Table NVARCHAR(200);  
	DECLARE @Index VARCHAR(200);  
	DECLARE @avg_fragmentation_in_percent int; 
	DECLARE @page_count int; 

	DECLARE db_cursor CURSOR FOR  
		SELECT 
		dbschemas.[name] as 'Schema', 
		dbtables.[name] as 'Table', 
		dbindexes.[name] as 'Index',
		indexstats.avg_fragmentation_in_percent,
		indexstats.page_count
		FROM sys.dm_db_index_physical_stats (@dbid, NULL, NULL, NULL, NULL) AS indexstats
		INNER JOIN sys.tables dbtables on dbtables.[object_id] = indexstats.[object_id]
		INNER JOIN sys.schemas dbschemas on dbtables.[schema_id] = dbschemas.[schema_id]
		INNER JOIN sys.indexes AS dbindexes ON dbindexes.[object_id] = indexstats.[object_id]
		AND indexstats.index_id = dbindexes.index_id
		WHERE dbindexes.[name] is not null and indexstats.database_id = @dbid  
		ORDER BY indexstats.avg_fragmentation_in_percent desc

	OPEN db_cursor   
	FETCH NEXT FROM db_cursor INTO @Schema, @Table, @Index, @avg_fragmentation_in_percent, @page_count 

	WHILE @@FETCH_STATUS = 0   
	BEGIN  
	   if(@skipLargeTables = 1 and CHARINDEX(@Table, @ExcludeTableList) > 0)
	   begin
		 print 'skipped index (' + CONVERT(varchar(10),@avg_fragmentation_in_percent) + '%) ' + @Index + ' for table ' + @Table + ', page count ' + CONVERT(varchar(10),@page_count)
	   end
	   else
	   begin
			if(@avg_fragmentation_in_percent <= @IndexFragmentationRebuildMin and @avg_fragmentation_in_percent > @IndexFragmentationMin and LEN(@Index) > 3)
			begin
			   print 'reorganizing index (' + CONVERT(varchar(10),@avg_fragmentation_in_percent) + '%) ' + @Index + ' for table ' + @Table + ', page count ' + CONVERT(varchar(10),@page_count)
			  
			   if(@PrintInformationOnly = 0)
			   begin
			      exec('ALTER INDEX [' + @Index + '] ON [' + @Schema + '].[' + @Table + '] REORGANIZE;');
			   end
			end
		    
			if(@avg_fragmentation_in_percent > @IndexFragmentationRebuildMin and LEN(@Index) > 3)
			begin
			   print 'rebuilding index (' + CONVERT(varchar(10),@avg_fragmentation_in_percent) + '%) ' + @Index + ' for table ' + @Table + ', page count ' + CONVERT(varchar(10),@page_count)
			  
			   if(@PrintInformationOnly = 0)
			   begin
			      exec('ALTER INDEX [' + @Index + '] ON [' + @Schema + '].[' + @Table + '] REBUILD WITH (MAXDOP=1,FILLFACTOR = 90)');
			   end
			end
		end
			
		FETCH NEXT FROM db_cursor INTO @Schema, @Table, @Index, @avg_fragmentation_in_percent, @page_count  
	END   

	CLOSE db_cursor   
	DEALLOCATE db_cursor
	
	END TRY 
BEGIN CATCH	
	DECLARE @ErrorMessage NVARCHAR(4000);
	DECLARE @ErrorSeverity INT;
	DECLARE @ErrorState INT;
	SELECT @ErrorMessage = 'Message: '+ ERROR_MESSAGE() + CHAR(13) + CHAR(10) + 
						   'Procedure: ' + CAST(ERROR_PROCEDURE() AS VARCHAR(MAX)) + CHAR(13) + CHAR(10) +  
						   'Line: ' + CAST(ERROR_LINE() AS VARCHAR(MAX)) + CHAR(13) + CHAR(10) +  
						   'Number: ' + CAST(ERROR_NUMBER() AS VARCHAR(MAX)),  @ErrorSeverity = ERROR_SEVERITY(), @ErrorState = ERROR_STATE();	

	RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);
	RETURN 0
END CATCH

END
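
If you prefer a Windows scheduled task over a SQL Server Agent job, the procedure can also be triggered from PowerShell. Below is a minimal sketch; it assumes the SqlServer module (which provides Invoke-Sqlcmd) is installed, and the server and database names are placeholders:

#nightly index maintenance - a sketch, not production code
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance "myServer" -Database "myDatabase" `
    -Query "EXEC [dbo].[IndexOptimize] @databaseName = N'myDatabase', @PrintInformationOnly = 0" `
    -QueryTimeout 0 #rebuilds may run for hours, so disable the query timeout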



Deleting old files by using schedule task and PowerShell

If you are an administrator or a programmer, you have probably been in a situation where you wanted to free up some server space by deleting old, unused files. The following script may be quite useful if you want to get rid of old files of a certain type, including ones packed into .zip packages.

The script checks all predefined file types for a creation date older than 6 months (this can be configured). If an old file is detected, it is deleted and that information is stored in the local log.txt file. Old .zip files will also be deleted, provided they contain only the files we want to delete.

The recommended way of using it is to create a Windows scheduled task that runs periodically, so the server space usage is always kept optimized.
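
For example, registering such a task from PowerShell could look like this (a sketch; the script path, task name, and schedule are placeholders to adjust):

#register a weekly cleanup task - assumes the script was saved as C:\scripts\delete_old_files.ps1
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\scripts\delete_old_files.ps1 -TestMode 0"
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 2am
Register-ScheduledTask -TaskName "DeleteOldFiles" -Action $action -Trigger $trigger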


 param (
[int]$TestMode = 1,
[int]$DeleteOlderThanMonths = 6,
[string]$DeleteFileExtension = ".jpg",
[string]$Root_Folder = "C:\",
[string]$Logfile = "Log.txt"
)

#writes log messages
Function LogWrite
{
   Param ([string]$logstring)
   $currentdate = get-date -format "M-d-yyyy HH:mm"
   Add-content $Logfile -value "$currentdate | $logstring"
}

#checks if zip file contains only DeleteFileExtension file type
Function CanZIPbeDeleted($zipFilePath)
{
  [int]$canBeDeleted = 1;

  try { 
      $zipArchive = Get-Item -Path $zipFilePath | ForEach-Object { 
         (New-Object -ComObject Shell.Application).NameSpace($_.FullName)
      }
  
      foreach($item in $zipArchive.items())
      {
        if(!$item.Path.EndsWith($DeleteFileExtension))
        {
           $canBeDeleted = 0
           $msg = "$zipFilePath cannot be deleted as it contains non $DeleteFileExtension file"
           LogWrite $msg
           Write-Host $msg -foreground Green 
           return 0
        }
     }   
   } 
   catch { 
       $msg = "Unexpected Error. Error details: $_.Exception.Message"
       LogWrite $msg
       Write-Host $msg -foreground Red 
       return 0
   } 
 
    return $canBeDeleted;
}

#deletes the file
Function DeleteFile($filePathToDelete)
{
  try
  { 
     #capture the size before the file is removed (Get-Item would fail afterwards)
     $size = (Get-Item -Path $filePathToDelete | Select-Object Name, @{Name="Size";Expression={[string]::Format("{0:0.00} MB", $_.Length / 1MB)}}).Size

     if($TestMode -eq 0)
     {
       Remove-Item $filePathToDelete
     }
     
     $msg = "File deleted ($size): $filePathToDelete"
     LogWrite $msg 
     Write-Host $msg -foreground "magenta"  
  } 
  catch { 
     $msg = "Unexpected Error. Error details: $_.Exception.Message"
     LogWrite $msg 
     Write-Host $msg -foreground Red 
   }    
}

 
 get-childitem -Path $Root_Folder -Recurse |  
 where-object { 
    $_.CreationTime -lt (get-date).addMonths(-$DeleteOlderThanMonths) -and ($_.FullName.EndsWith($DeleteFileExtension) -or $_.FullName.EndsWith(".zip"))
  } | 
 Foreach-Object {
 
    if ($_.FullName.EndsWith(".zip"))
    {
       $canDeleteZip = CanZIPbeDeleted $_.FullName
       if($canDeleteZip -eq 1)
       {
          DeleteFile $_.FullName
       }
    }
    else
    {
       DeleteFile $_.FullName
    }
 } 

Enjoy!


SVN log parsing using PowerShell

This script can be useful for gathering the SVN release changes that have occurred since the last release. It connects to the SVN server and gets the logs within a date range (from the last build date to now). The logs are then filtered and cleaned (entries with no comments are removed). Next, an email message is sent based on the retrieved data. To be able to run the script, you need to configure it by setting up your own SVN server URL and credentials; also, please make sure you have enabled running PowerShell scripts on your server.

Enjoy!


 param (
[string]$Version = "2.12.0.2",
[string]$SendEmailsTo = "test@test.com",
[string]$SVNUser = "user",
[string]$SVNPass = "pass",
[string]$SVNbranchRoot = "https://svn.test.com/www/Branches/MyApp/2.12.0"
)

#get svn logs for the current release
 Function GetSVNLogs($lastBuildDate_)
 {
    [string]$fromDate = $lastBuildDate_.ToString("yyyy-MM-dd");
    [string]$toDate = (get-date).ToString("yyyy-MM-dd");
    [string]$RevisionDates = "{$fromDate}:{$toDate}"; 

    #add -v param for verbose output; svn.exe is assumed to be available on the PATH
    [xml]$log = (& svn log $SVNbranchRoot -r $RevisionDates --limit 500 --xml --username $SVNUser --password $SVNPass);

    $logObjects = $log.log.logentry | Foreach-Object {
            $logEntry = $_

            $logEntry | Select-Object `
                @{ Name = "Revision"; Expression = { [int]$logEntry.revision } },
                @{ Name = "Author"; Expression = { $logEntry.author } },
                @{ Name = "Date"; 
                   Expression = {
                       if ( $NoFormat )
                       {
                           [datetime]$logEntry.date
                       }
                       else
                       {
                           "{0:dd/MM/yyyy hh:mm:ss}" -f [datetime]$logEntry.date
                       }
                   } },
                @{ Name = "Message"; Expression = { $logEntry.msg } } | 

            Foreach-Object {        
                $_ | where-object { ![string]::IsNullOrEmpty($logEntry.msg) }  | Select-Object  Author, Date, Message              
            }
      }

    return $logObjects
 }

#send email function
 Function SendEmail($SendEmailsTo_, $EmailSubject_, $changes_)
 {
   $emailFrom = "automation@test.com" 
   $smtpserver="smtp.test.com" 
   $smtp=new-object Net.Mail.SmtpClient($smtpServer) 

   foreach ($email in $SendEmailsTo_.split(';'))
   {   
      $smtp.Send($emailFrom, $email, $EmailSubject_, $changes_)
   }
 }

#get svn logs since the last release
#note: $lastBuildDate has to be set beforehand, e.g. using the TeamCity REST API script from this blog
$SVNChanges = GetSVNLogs $lastBuildDate

$changes += "`r`n`r`n";
$changes += "SVN changes sice last release`r`n";
$changes += $SVNChanges | Format-Table -AutoSize | Out-String;
$changes += "`r`n-------------------------------------------------------------------`r`n";
$changes += "This is automated email, please do reply directly!";

#send email 
$EmailSubject = "Release $Version changes";

SendEmail $SendEmailsTo $EmailSubject $changes;
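
Assuming the script was saved as svn_log_parser.ps1 (the file name is illustrative), a typical invocation looks like this:

#hypothetical invocation - adjust credentials and branch URL to your environment
.\svn_log_parser.ps1 -Version "2.12.0.2" -SendEmailsTo "team@test.com" -SVNUser "user" -SVNPass "pass" `
    -SVNbranchRoot "https://svn.test.com/www/Branches/MyApp/2.12.0"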


PowerShell unattended execution with auto-retry and SMS alerts

Running automated processes seems to be a very easy task. However, it becomes a little more complicated when running critical unattended operations. Imagine a situation where you have to run a critical, automated process at night while your business relies on its successful completion.

For example, if you are running an ETL process at night that loads data to be available for the users the next morning, you want to be warned early if something goes wrong, as you won't have enough time to reload it the next morning, say before 9am. During the process, you may get either a data or an infrastructure related error; one of the servers may be down for a few seconds, causing the entire operation to fail.

The script below resolves some of the potential problems that may occur in such scenarios by auto-retrying processes on error and sending SMS alerts directly to your mobile, so you will probably be alerted much earlier, unless you read emails at night 🙂

 

In the script, you can define multiple input files to be run by the process you specify. For example, you may define multiple dtsx packages to be run one by one. Whatever the process runs, you need to make sure that it returns 0 on successful completion or a non-zero integer on error. If the exe process does not close itself returning an exit code, it will hang forever.
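
A quick way to verify that a process really terminates and reports its exit code is the sketch below; it uses cmd.exe, which here simply exits with code 3:

#Start-Process with -Wait -PassThru exposes the exit code of the finished process
$p = Start-Process -FilePath "C:\Windows\System32\cmd.exe" -ArgumentList "/c exit 3" -Wait -PassThru
$p.ExitCode #prints 3; a process that never exits on its own would block here forever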

Apart from the input files and the exe process, you also need to define the execution wait times, email recipient list, SMTP server, and database connection if you need to log audit information. If you want to send SMS alerts, you will also need to create an account on txtlocal.co.uk.

For this example, I have defined three csv files (data1.csv, data2.csv, data3.csv) to be opened in Notepad (with auto-retry twice for each of them); closing Notepad manually simulates successful completion. Of course, you need to adjust this to your requirements.

 

 param (
[string[]]$inputFiles = @(
#add paths to more files if needed here
#format: input file path;NumberOfRetries,
#########################################

"c:\data1.csv;2",
"c:\data2.csv;2",
"c:\data3.csv;2",

#########################################
""),
[string]$exePath = "C:\Windows\System32\notepad.exe",
[string]$LoadSuccessfulNotificationList = "email1@mydomain.com;email2@mydomain.com",
[string]$ErrorNotificationList = "email1@mydomain.com;email2@mydomain.com",
[int]$RetryWaitTimeSec = 60,
[int]$NextFileExecWaitTimeSec = 3,
[string]$connString = "Data Source=myServer,1433;Initial Catalog=MyDatabase;integrated security=SSPI;persist security info=False;Trusted_Connection=Yes"
)

[System.Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | out-null

##define phone numbers to send error alerts to (format: 441234567890,441234567892)
$global:numbers = [System.Web.HttpUtility]::UrlEncode("441234567890");

#####################no need to edit below

$global:LogInfo = "";
$global:SmsLogInfo = "";
$global:errorFlag = 0;
[int]$firstFlag = 1;


function SendSms([string]$smsMessage)
{
    if($smsMessage.trim().Length -eq 0) { return;}
    if($smsMessage.trim().Length -gt 155) 
    {
      $smsMessage = $smsMessage.trim().Substring(0,150) + "...";
    }

    $username = [System.Web.HttpUtility]::UrlEncode("mail@mydomain.com");
    $hash = [System.Web.HttpUtility]::UrlEncode("my hash - get it from txtlocal.co.uk");##if not working please check credentials at https://control.txtlocal.co.uk/docs/
    ###
    $message = [System.Web.HttpUtility]::UrlEncode($smsMessage);
    $sender = [System.Web.HttpUtility]::UrlEncode("My alerts");
    ###
    $data = 'username=' + $username + '&hash=' + $hash + '&numbers=' + $global:numbers + "&sender=" + $sender + "&message=" + $message;
    $response = Invoke-WebRequest -Uri "https://api.txtlocal.com/send/?$data";
    ###
    $ser = New-Object System.Web.Script.Serialization.JavaScriptSerializer;
    $result = $ser.DeserializeObject($response.Content);

    return $result;
}

Function LogToDatabase($exitCode,$inputFilePath,$duration,$cpu,$description)
{
    $inputFileName = [System.IO.Path]::GetFileName($inputFilePath);

    $SqlConnection = new-object System.Data.SqlClient.SqlConnection;
    $SqlConnection.ConnectionString = $connString;
    $SqlCommand = $SqlConnection.CreateCommand();
    $SqlCommand.CommandType = [System.Data.CommandType]::StoredProcedure;
    $SqlCommand.CommandText = "MonitorAudit_Save";

    $SqlCommand.Parameters.AddWithValue("@PCName", [Environment]::MachineName + " - " + [Environment]::UserDomainName + "/" + [Environment]::UserName) 
    $SqlCommand.Parameters.AddWithValue("@ExePath", $exePath) | Out-Null

    $SqlCommand.Parameters.AddWithValue("@ExitCode", $exitCode) | Out-Null
    $SqlCommand.Parameters.AddWithValue("@InputFilePath", $inputFilePath) | Out-Null
    $SqlCommand.Parameters.AddWithValue("@InputFileName", $inputFileName) | Out-Null

    $SqlCommand.Parameters.AddWithValue("@Duration", $duration) | Out-Null

    $SqlCommand.Parameters.AddWithValue("@CPU", $cpu) | Out-Null
    $SqlCommand.Parameters.AddWithValue("@Description", $description) | Out-Null

    $SqlConnection.Open()
    $SqlCommand.ExecuteNonQuery() | Out-Null
    $SqlConnection.Close()
}

 Function SendEmail($SendEmailsTo_, $EmailSubject_, $changes_)
 {
   $emailFrom = "errors@mydomain.com" 
   $smtpserver="mysmtp_server" 
   $smtp=new-object Net.Mail.SmtpClient($smtpServer) 
 
   foreach ($email in $SendEmailsTo_.split(';'))
   {   
      if($email.Length -gt 0)
      {
        $smtp.Send($emailFrom, $email, $EmailSubject_, $changes_);
      }
   }
 }
 
$inputFiles | Foreach-Object {
   
  if($_.trim().Length -gt 0)  {

      [int]$numOfRetries = 1;
      if($_.split(';').Length -gt 1) {
        $numOfRetries = $_.split(';')[1]; if($numOfRetries -lt 1) { $numOfRetries = 1 }
      }

      if($firstFlag -eq 0 -and $NextFileExecWaitTimeSec -gt 0){
        $global:LogInfo += "Waiting $NextFileExecWaitTimeSec seconds to execute next input file...`r`n`r`n";
        Start-Sleep -s $NextFileExecWaitTimeSec
      }
      if($firstFlag -ne 0) { $firstFlag = 0;}

      for ($i = 0; $i -le $numOfRetries -1; $i++) {
       try
        {
           [string]$inputFilePath = $_.split(';')[0]; "processing $inputFilePath ..."; $dateStart = (Get-Date).ToString(); [DateTime]$dateStartObj = (Get-Date);

           if(-not(Test-Path $inputFilePath))
           {
              throw New-Object System.Exception ("Input file not found: $inputFilePath");
           }

           if(-not(Test-Path $exePath))
           {
              $global:errorFlag =1;
              throw New-Object System.Exception ("Process exe not found: $exePath");
           }

           #add -windowstyle Hidden when in production
           $exeProcess= Start-Process $exePath -Wait -PassThru -ErrorAction SilentlyContinue  -WarningAction SilentlyContinue -ArgumentList " `"$inputFilePath`"" 
           

           #log info
           $cpu = $exeProcess.CPU.ToString("##.#"); $dateEnd = (Get-Date).ToString(); $global:LogInfo += " $inputFilePath `r`nAverage CPU: $cpu | Start time $dateStart | End time $dateEnd`r`n`r`n";

           if($exeProcess.ExitCode -ne 0)
           {
              throw New-Object System.Exception ("execution error");
           }
           
           #uncomment below line to log to database
           #LogToDatabase $exeProcess.ExitCode $inputFilePath (Get-Date).Subtract($dateStartObj).TotalSeconds $exeProcess.CPU.ToString("##.#") "OK" | Out-Null

           break;
        } 
        catch { 
           $global:errorFlag =1;
          try
           {
           $inputFileName = [System.IO.Path]::GetFileName($inputFilePath);
           } catch { $inputFileName = ""; }

           $msg = "$LogInfo Unexpected Error when processing $inputFileName. Error: $_";  Write-Host $msg -foreground Red;
           $global:LogInfo += "###########`r`nUnexpected Error when processing $inputFileName`r`n" + "Error: $_ `r`n##########`r`n`r`n";
 
           $global:SmsLogInfo += "Error processing: " + $inputFileName + " |"; 

           $cpu = "0";
           if ($exeProcess -ne $null) { $cpu = $exeProcess.CPU.ToString("##.#"); }

           #uncomment below line to log to database
           #LogToDatabase 1 $inputFilePath (Get-Date).Subtract($dateStartObj).TotalSeconds $cpu "Error: $_" | Out-Null

           #SendEmail $ErrorNotificationList "loading error" $msg
           if($i -ge $numOfRetries - 1) { break; } #no retries left

           #retry it
           $global:LogInfo += "Waiting $RetryWaitTimeSec seconds to retry...`r`n`r`n";
           Start-Sleep -s $RetryWaitTimeSec 
        } 
      }
   }
} 

if($errorFlag -eq 0) {
   SendEmail $LoadSuccessfulNotificationList "Loading process successful" "Processing has been successful for following input files: `r`n-------------------------`r`n$LogInfo Have a nice day :)";
}
else
{
   SendEmail $ErrorNotificationList "Loading process finished with errors" "Processing has NOT been successful for some of the input files: `r`n-------------------------`r`n$LogInfo";
   SendSms $global:SmsLogInfo;
}

Enjoy!


Using automation for gathering release changes

If you need to gather release changes when creating installers, the script below can help you. The script simply gets all, or only the current, database release changes stored in files. I use it to automatically gather the latest changes after TeamCity builds a release installer. The changes can then be sent to PR for documentation. You can of course use more complicated patterns when extracting the script text if required. To execute it, just pass in the update script location, version number, and email addresses.


 

 param (
 [string]$ScriptFolder = "C:\Database\Updates",
 [string]$Version = "2.11.0.0",
 [string]$SendEmailsTo = "test@test.com",
 [boolean]$GetAllChanges = 0
)

#sends notification emails
 Function SendEmail($SendEmailsTo_, $EmailSubject_, $changes_)
 {
   $emailFrom = "teamcity@test.com" 
   $smtpserver="smtp.test.com" 
   $smtp=new-object Net.Mail.SmtpClient($smtpServer) 

   foreach ($email in $SendEmailsTo_.split(';'))
   {   
      $smtp.Send($emailFrom, $email, $EmailSubject_, $changes_)
   }
 }

 #get file by version number
 Function GetReleaseFileName($Version_)
 {
    $VersionFilename_ = "v";
    $VersionFilename_ += ($Version_.split('.') | ForEach-Object {
        [System.Convert]::ToDouble($_).ToString("00");
    }) -join '.';
    $VersionFilename_ += ".sql";

    #format e.g. v02.11.00.00.sql
    return $VersionFilename_;
 }

#return if no emails added
if($SendEmailsTo.trim() -eq "") { "no emails defined!"; return;}

$VersionFilename =  GetReleaseFileName $Version;

$EmailSubject = "Release $Version database changes";
$changes = "Changes in release file $VersionFilename `r`n `r`n";
$hasChanges = 0;

if(!$GetAllChanges) {  
    (dir $ScriptFolder) | sort CreationTime -Descending |
     ForEach-Object {  
        $fileContent = get-content ($_.Directory.FullName + "\" + $VersionFilename) -Raw;

        $patterns = 'alter','drop','insert','delete','update','create'; #patterns/strings to find
        $hasChanges = 0;
        foreach ($pattern in $patterns)
        {
           $fileContent.split("`r`n") | ?  { $_ -match $pattern } | % { $changes += $_.trim() + "`r`n"; $hasChanges = 1; } 
        } 

        if($hasChanges -eq 0) { $changes += "`r`nNo database changes for release $Version !"}

        write-output $changes

        SendEmail $SendEmailsTo $EmailSubject $changes;

        break; #process only the newest release file (break also ends the pipeline here)
    }
}
else
{
  #parse all files for all previous releases
  $EmailSubject = "All database changes";
  $changes = "All database changes: `r`n `r`n";

  (dir $ScriptFolder) | sort CreationTime -Descending |
    ForEach-Object {  
        $fileContent = get-content ($_.Directory.FullName + "\" + $_.name) -Raw;

        $patterns = 'alter','drop','insert','delete','update','create';
        $changes += "`r`nChanges in release file $_`r`n";
        $hasChanges = 0;

        foreach ($pattern in $patterns)
        {          
           if($fileContent)
           {
              $fileContent.split("`r`n") | ?  { $_ -match $pattern } | % { $changes += $_.trim() + "`r`n"; $hasChanges = 1; }  
           } 
        } 

        if($hasChanges -eq 0) { $changes += "`r`nNo database changes for release " + $_.name.Replace(".sql","").Replace("v0","");}

        $changes += "`r`n-------------------------------`r`n";      
    }

    write-output $changes 

    SendEmail $SendEmailsTo $EmailSubject $changes;
}
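
Assuming the script was saved as release_changes.ps1 (the file name is illustrative), it can be invoked like this:

#gather changes for a single release and email them
.\release_changes.ps1 -ScriptFolder "C:\Database\Updates" -Version "2.11.0.0" -SendEmailsTo "test@test.com"

#or parse all release files at once
.\release_changes.ps1 -ScriptFolder "C:\Database\Updates" -SendEmailsTo "test@test.com" -GetAllChanges 1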



Getting data from TeamCity Rest API – PowerShell

A simple script that gets build data from the TeamCity server REST API. When executed as a build step, you can pass the current build id to it. Having that, we can get the previous build and its data, and then proceed with further processing, for example gathering documentation, statistics, etc. Please make sure you have enabled running PowerShell scripts on your server.

Enjoy!


 param (
[int]$BuildId = 2274
)

#gets last build date by build id
Function GetLastBuildDateByBuildId($LastBuildId_)
{
  $secpasswd = ConvertTo-SecureString "mypassword" -AsPlainText -Force
  $cred = New-Object System.Management.Automation.PSCredential("username", $secpasswd)

  $build = (Invoke-RestMethod -Credential $cred -Uri "https://teamcity:88/httpAuth/app/rest/builds/id:$LastBuildId_").build;
  return [DateTime]::ParseExact($build.startdate.split("T")[0],'yyyyMMdd', $null) 
}

#get last build date
$lastBuildDate = GetLastBuildDateByBuildId  ($BuildId -1)

$lastBuildDate
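
When wired up as a TeamCity build step, the current build id can be passed in via the %teamcity.build.id% build parameter (the script file name is illustrative):

#hypothetical command-line build step
powershell -NoProfile -File .\teamcity_api.ps1 -BuildId %teamcity.build.id%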

MSSQL database backup/restore helper scripts

When working with MSSQL Server, I have spotted some general problems when restoring databases. Below I present some common solutions that can save you development time.

If you are restoring a database backup taken from the same database but running on a different server, e.g. restoring a production database on your test server, you may get a security error when trying to connect with the old DB user.
This is because the DB user account was generated on a different server, where the Windows security token was different. The solution is to remap the user on the restored database:

-- remap the user to the login
ALTER USER DB_user WITH LOGIN = DB_user;

-- set the password
ALTER LOGIN DB_user WITH PASSWORD = 'secret';

If you must overwrite your test database with a different one, you first need to close all current database connections; otherwise you may get an error saying that the database is in use. The quick solution for that is the script below.
It simply sets the database offline and back online. This is sometimes much faster than waiting for all connections to be closed in other ways (note: use it only on the test server).

ALTER DATABASE [MyDB]
SET OFFLINE WITH ROLLBACK IMMEDIATE
ALTER DATABASE [MyDB]
SET ONLINE
GO

If you get the following error: “Cannot execute as the database principal because the principal "dbo" does not exist, this type of principal cannot be impersonated, or you do not have permission”, this is the solution:

 ALTER AUTHORIZATION ON DATABASE::[myDatabase] TO [sa];

If your database backup takes a lot of space but you know that there is not much data in it, you can simply shrink the log files before creating the backup.

Use MyDB
ALTER DATABASE MyDB
SET RECOVERY SIMPLE;
GO

DBCC SHRINKFILE ('MyDB_log',TRUNCATEONLY);
GO

ALTER DATABASE MyDB
SET RECOVERY FULL;

Finally, if you want to grant execute permissions on all stored procedures at once (on the restored database), you can use the following script:

 CREATE PROCEDURE [dbo].[procExecuteOnProcs] (
   @User varchar(max)
 )
 AS
  declare procs cursor for
  select routine_schema, routine_name from information_schema.routines where routine_type = N'PROCEDURE'

  declare @Schema nvarchar(128), @Name nvarchar(128), @SQL nvarchar(512)

  open procs
  fetch next from procs into @Schema, @Name
  while @@fetch_status = 0
  begin
    set @SQL = N'grant execute on [' + @Schema + N'].[' + @Name + N'] to ' + @User
    exec sp_ExecuteSQL @SQL
    fetch next from procs into @Schema, @Name
  end
  close procs
  deallocate procs
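
A hypothetical call for the restored database could then look like this (run from PowerShell, assuming the SqlServer module is installed; server, database, and user names are placeholders):

#grant execute on every stored procedure to the remapped user
Invoke-Sqlcmd -ServerInstance "myServer" -Database "myDatabase" -Query "EXEC [dbo].[procExecuteOnProcs] @User = N'DB_user'"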

I hope these simple tricks help you. Please note that you can also automate the above scripts to be part of other processes.


Executing SQL scripts in a directory using a .bat file

If you are in a situation where you have to execute all SQL scripts located as separate files in one folder, running a .bat file is one of the options you can use. Executing a batch command is quite a flexible solution, as you don't have to recompile it every time the script changes.

The code below loops through all files within the specified directory, loads each of them into a variable, and replaces the param string, if it exists, before execution. It is helpful if you need to replace database-specific users (e.g. execute with 'username'), etc.

Please note that variables within the loop are accessed by !myVar! instead of %myVar% because delayed expansion is applied.

The command cd /d %1 sets the current directory to the one specified by the user in the first parameter.

 @ECHO OFF
 echo script folder: %1
 echo server: %2
 echo database: %3
 echo database user: %4
 echo ----------------

setlocal enabledelayedexpansion
cd /d %1

for %%d in (*.sql) do (
   set filecontent=
   for /f "delims=" %%a in (%%d) do (set "filecontent=!filecontent!%%a")

   set "temp=!filecontent!"

   set "filecontent=!temp:#PARAM#=%4!"

   echo executing %%d
   echo ----------
   sqlcmd -S%2 -E -Q"!filecontent!" -d%3 
)
pause

The above code can be executed as follows (a trusted database connection is used):

 C:\updateDB.bat "C:\scripts\" "myServer" "databaseName" PARAM_TO_REPLACE