Auto publishing reports to Tableau Server (Sat, 09 Jan 2016)
https://pxm-software.com/auto-publishing-reports-to-tableau-server/

Tableau is a great tool for data visualization; however, if you use it a lot, you may want to automate some things. One of them is publishing/updating reports on Tableau Server. This is where the tabcmd command line utility comes in handy: you can install it on your development server and use it in a PowerShell script. One solution is to use a CI server for auto deployments; you then only need to give git/svn access to the users changing the reports or adding new ones.


The following script can be run as a build step in TeamCity to detect workbooks that have changed recently and publish them automatically to the Tableau server. The parent folder of each workbook is used as the project name when publishing. In order to run it, just pass in the email notification list and the server password; of course, you also need to configure the params (server URL, SMTP, etc.).

param (
[string]$cmddir = "C:\Program Files\Tableau\Tableau Server\8.2\extras\Command Line Utility", #location where tabcmd has been installed
[string]$server = "https://tableau:81", #this is url of the Tableau server 
[string]$currentDir = (split-path -parent $MyInvocation.MyCommand.Definition) +"\", #current script location
[string]$notificationEmailList = "test1@test.com,test2@test.com", #send email notifications if successful
[string]$admin = "user", #admin account for the server
[string]$pass = "" #to be passed in as param
)
 
function SendEmail($emailTo,$title,$body)
{ 
   $smtp=new-object Net.Mail.SmtpClient("my_smtp_server"); $smtp.Send("sentAs@mydomain.com", $emailTo, $title, $body);
}
 
$global:temp_ = "";
 
#login to Tableau
cd  $cmddir
.\tabcmd login -s $server -u $admin -p $pass
 
 get-childitem -Path $currentDir -Recurse |  where-object { 
    $_.LastWriteTime -gt (get-date).AddMinutes(-10) -and $_.FullName.EndsWith(".twb")
  } | 
  Foreach-Object {
 
       [string]$projectName = $_.Directory.Name;
        $global:temp_ += [string][System.IO.Path]::GetFileName($_.FullName) + " | ";
 
       #publish or overwrite workbook on the server
       .\tabcmd publish $_.FullName -r $projectName  -o  
  } 
 
 
#more commands
#.\tabcmd publish "workbook.twbx" -r "project name" -n "Workbook Name" --db-user "" --db-password "" -o
 
 
#log out to release the session
.\tabcmd logout
 
if($global:temp_ -ne "")
{
   SendEmail $notificationEmailList "Tableau report published" "Following report(s) has just been successfully published to Tableau Server: $global:temp_"
}
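If the script is saved as, say, publish-reports.ps1, a TeamCity command line build step can call it and inject the password from a secured build parameter. A minimal sketch (the file name and parameter name are assumptions):

 #hypothetical build step invocation; %tableau.password% is a secured TeamCity parameter
 powershell -NoProfile -ExecutionPolicy Bypass -File .\publish-reports.ps1 -pass "%tableau.password%" -notificationEmailList "bi@test.com"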

Enjoy!

Auto creating installers using WIX (Open source) and CI server (Mon, 02 Nov 2015)
https://pxm-software.com/auto-creating-installers-using-wix-open-source-and-ci-server/

Creating installer packages has become more complicated recently, as Microsoft decided not to support installation projects (.vdproj) in VS 2012 upwards. On top of that, there is an additional difficulty if we want to auto-create installers on a server that hasn't got a commercial Visual Studio installed: we just cannot do it 🙁

The perfect solution for that is to use the WIX installer (open source). WIX enables you to build installation packages from XML files without relying on Visual Studio. The only component required is the WIX toolset installed on the build server (wixtoolset.org).

The main WIX file is the .wixproj file that builds the required package, either from inside Visual Studio or from the command line without VS. In that project file we can find commands that use the WIX heat, candle and light modules to create the required msi files. See the samples below:

The harvesting module creates a dynamic list of all files that need to be included in the installer. This list is gathered after our website app is published to a local folder:

  <Exec Command="&quot;$(WixPath)heat.exe&quot; dir $(PublishF) -dr INSTALLLOCATION -ke -srd -cg MyWebWebComponents -var var.publishDir -gg -out WebSiteContent.wxs" ContinueOnError="false" WorkingDirectory="." />

After harvesting, we can finally build the installer. Please note that we can pass in custom parameters when triggering the candle module; each parameter must be preceded with the letter "d", e.g. -dMyParam. In our sample we will use this to pass in command line arguments that define our installer's name, version, website root, etc.

 <Exec Command="&quot;$(WixPath)candle.exe&quot; -ext WixIISExtension -ext WixUtilExtension -ext WiXNetFxExtension -dProductName=$(ProductName) -dWebFolderName=$(WebFolderName) -dProductVersion=$(ProductVersion) -dUpgradeCode=$(UpgradeCode) -dProductCode=$(ProductCode) -dpublishDir=$(PublishF) -dMyWebResourceDir=. @(WixCode, ' ')" ContinueOnError="false" WorkingDirectory="." />

 <Exec Command="&quot;$(WixPath)light.exe&quot; -ext WixUIExtension -ext WixIISExtension -ext WixUtilExtension -ext WiXNetFxExtension -out $(MsiOut) @(WixObject, ' ')" ContinueOnError="false" WorkingDirectory="." />

For the purpose of this article we will create a very simple installer without configuring IIS settings, deploying SQL scripts, etc. All of the installer's basic configuration can be found in UIDialogs.wxs, MyWebUI.wxs and IisConfiguration.wxs; you may want to adjust it for your needs. The file Product.wxs is the main entry point where you can define installation folders, actions, validations, etc.

Please note that adding the UpgradeCode and MajorUpgrade configuration elements will cause any older installed version to be automatically uninstalled prior to the installation. Also, any custom params you want to use inside your installer configuration must be referenced like this: $(var.MyParamName).

 <Product Id="$(var.ProductCode)" 
			 Name="$(var.ProductName)" 
			 Language="1033" 
			 Version="$(var.ProductVersion)" 
			 Manufacturer="MyCompany" 
			 UpgradeCode="$(var.UpgradeCode)" >
 <MajorUpgrade DowngradeErrorMessage="A newer version of Product Name Removed is already installed."/> 
 
 .....

After defining our WIX configuration files, we can finally build our installer. We can do it using a batch or PowerShell script, or use Visual Studio, because the .wixproj file is just a normal MSBuild file.

When using a CI server we can simply run the PowerShell installation process, passing in our version number (build number, revision, etc.) and any other parameters we need.

This is how we can run it with PowerShell:
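A minimal sketch of such an invocation, assuming the project file is named MyInstaller.wixproj and the .NET 4 msbuild is used (both names are assumptions):

 #properties map to the $(ProductName), $(ProductVersion) etc. referenced in the .wixproj
 & "C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe" .\MyInstaller.wixproj /p:Configuration=Release /p:ProductName="MyWebsite" /p:ProductVersion="1.0.0.123"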

We can still use Visual Studio to do the same, simply by building the .wixproj project from within the IDE.

I have attached a working sample (MyWebsite) below. Please don't forget to install the WIX toolset before running it. Also, please check that all paths to the WIX and msbuild folders are the same on your server (or use environment variables instead).

Enjoy!

Deleting old files by using schedule task and PowerShell (Wed, 29 Jul 2015)
https://pxm-software.com/deleting-old-files-by-using-schedule-task-and-powershell/

If you are an administrator or a programmer, you have probably been in a situation where you wanted to free up some server space by deleting old, unused files. The following script may be quite useful if you want to get rid of old files of a certain type, including when they are packed into .zip packages.

The script checks for all predefined file types with a creation date older than 6 months (configurable). If an old file is detected, it is deleted and that information is stored in the local log.txt file. Old .zip files will also be deleted, but only if they contain nothing but the files we want to delete.

The recommended way of using it is to create a Windows scheduled task that runs periodically, so the server space usage is always optimized; registering such a task is sketched below.

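On Windows 8 / Server 2012 or newer the task can be registered straight from PowerShell. A minimal sketch (the script path, task name and schedule are assumptions):

 $action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\delete_old_files.ps1 -TestMode 0"
 $trigger = New-ScheduledTaskTrigger -Daily -At 2am
 Register-ScheduledTask -TaskName "DeleteOldFiles" -Action $action -Trigger $trigger -User "SYSTEM"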

 param (
[int]$TestMode = 1,
[int]$DeleteOlderThanMonths = 6,
[string]$DeleteFileExtension = ".jpg",
[string]$Root_Folder = "C:\",
[string]$Logfile = "Log.txt"
)

#writes log messages
Function LogWrite
{
   Param ([string]$logstring)
   $currentdate = get-date -format "M-d-yyyy HH:mm"
   Add-content $Logfile -value "$currentdate | $logstring"
}

#checks if zip file contains only DeleteFileExtension file type
Function CanZIPbeDeleted($zipFilePath)
{
  [int]$canBeDeleted = 1;

  try { 
      $zipArchive = Get-Item -Path $zipFilePath | ForEach-Object { 
         (New-Object -ComObject Shell.Application).NameSpace($_.FullName)
      }
  
      foreach($item in $zipArchive.items())
      {
        if(!$item.Path.EndsWith($DeleteFileExtension))
        {
           $canBeDeleted = 0
           $msg = "$zipFilePath cannot be deleted as it contains non $DeleteFileExtension file"
           LogWrite $msg
           Write-Host $msg -foreground Green 
           return 0
        }
     }   
   } 
   catch { 
       $msg = "Unexpected Error. Error details: $_.Exception.Message"
       LogWrite $msg
       Write-Host $msg -foreground Red 
       return 0
   } 
 
    return $canBeDeleted;
}

#deletes the file
Function DeleteFile($filePathToDelete)
{
  try
  { 
     #get the file size before the file is removed
     $size = (Get-Item -Path $filePathToDelete | Select-Object Name, @{Name="Size";Expression={[string]::Format("{0:0.00} MB", $_.Length / 1MB)}}).Size

     if($TestMode -eq 0)
     {
       Remove-Item $filePathToDelete
     }

     $msg = "File deleted ($size): $filePathToDelete"
     LogWrite $msg 
     Write-Host $msg -foreground "magenta"  
  } 
  catch { 
     $msg = "Unexpected Error. Error details: $_.Exception.Message"
     LogWrite $msg 
     Write-Host $msg -foreground Red 
   }    
}

 
 get-childitem -Path $Root_Folder -Recurse |  
 where-object { 
    $_.CreationTime -lt (get-date).addMonths(-$DeleteOlderThanMonths) -and ($_.FullName.EndsWith($DeleteFileExtension) -or $_.FullName.EndsWith(".zip"))
  } | 
 Foreach-Object {
 
    if ($_.FullName.EndsWith(".zip"))
    {
       $canDeleteZip = CanZIPbeDeleted $_.FullName
       if($canDeleteZip -eq 1)
       {
          DeleteFile $_.FullName
       }
    }
    else
    {
       DeleteFile $_.FullName
    }
 } 

Enjoy!

SVN log parsing using PowerShell (Wed, 22 Jul 2015)
https://pxm-software.com/svn-log-parsing-using-powershell/

This script can be useful for gathering the SVN release changes that have occurred since the last release. It connects to the SVN server and gets the logs within a date range (from the last build date until now). The logs are then filtered and cleaned (entries with no comments are removed). Next, an email message is sent based on the retrieved data. To be able to run the script you need to configure it by setting up your own SVN server URL and credentials; also, please make sure you have enabled running PS scripts on your server.

Enjoy!


 param (
[string]$Version = "2.12.0.2",
[string]$SendEmailsTo = "test@test.com",
[string]$SVNUser = "user",
[string]$SVNPass = "pass",
[string]$SVNbranchRoot = "https://svn.test.com/www/Branches/MyApp/2.12.0"
)

#path to the svn command line client; adjust if svn.exe is not on your PATH (an assumption, the original post does not define it)
$svn = "svn.exe"

#date of the last build/release; in a CI setup this would come from the build server (placeholder value)
$lastBuildDate = (Get-Date).AddDays(-7)

#get svn logs for the current release
 Function GetSVNLogs($lastBuildDate_)
 {
    [string]$fromDate = $lastBuildDate_.ToString("yyyy-MM-dd");
    [string]$toDate = (get-date).ToString("yyyy-MM-dd");
    [string]$RevisionDates = "{$fromDate}:{$toDate}"; 

    #add -v param for verbose
    1$log = 1(&$svn log $SVNbranchRoot -r $RevisionDates --limit 500 --xml --username $SVNUser --password $SVNPass);

    $logObjects = $log.log.logentry | Foreach-Object {
            $logEntry = $_

            $logEntry | Select-Object `
                @{ Name = "Revision"; Expression = { [int]$logEntry.revision } },
                @{ Name = "Author"; Expression = { $logEntry.author } },
                @{ Name = "Date"; 
                   Expression = {
                       if ( $NoFormat )
                       {
                           [datetime]$logEntry.date
                       }
                       else
                       {
                           "{0:dd/MM/yyyy hh:mm:ss}" -f [datetime]$logEntry.date
                       }
                   } },
                @{ Name = "Message"; Expression = { $logEntry.msg } } | 

            Foreach-Object {        
                $_ | where-object { ![string]::IsNullOrEmpty($logEntry.msg) }  | Select-Object  Author, Date, Message              
            }
      }

    return $logObjects
 }

#send email function
 Function SendEmail($SendEmailsTo_, $EmailSubject_, $changes_)
 {
   $emailFrom = "automation@test.com" 
   $smtpserver="smtp.test.com" 
   $smtp=new-object Net.Mail.SmtpClient($smtpServer) 

   foreach ($email in $SendEmailsTo_.split(';'))
   {   
      $smtp.Send($emailFrom, $email, $EmailSubject_, $changes_)
   }
 }

#get svn logs since the last release
$SVNChanges = GetSVNLogs $lastBuildDate

$changes += "`r`n`r`n";
$changes += "SVN changes sice last release`r`n";
$changes += $SVNChanges | Format-Table -AutoSize | Out-String;
$changes += "`r`n-------------------------------------------------------------------`r`n";
$changes += "This is automated email, please do reply directly!";

#send email 
$EmailSubject = "Release $Version changes";

SendEmail $SendEmailsTo $EmailSubject $changes;
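A typical invocation, assuming the script is saved as get-svn-changes.ps1 (the file name is an assumption):

 powershell -NoProfile -File .\get-svn-changes.ps1 -Version "2.12.0.2" -SendEmailsTo "team@test.com" -SVNbranchRoot "https://svn.test.com/www/Branches/MyApp/2.12.0"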

Using Power-Shell in automated tasks (batch script alternative) (Sun, 28 Jun 2015)
https://pxm-software.com/using-power-shell-in-automated-tasks-batch-script-alternative/

PowerShell is Microsoft's automation framework sitting on top of .NET (System.Management.Automation) that lets us perform advanced automated tasks. It can help not only administrators but also programmers save a lot of time by automating repeatable processes. The script syntax is quite intuitive and easy to learn, so users already writing e.g. batch scripts can grasp it very fast.

If we have, for example, an old batch file that executes SQL scripts stored in a folder, as follows:

 @ECHO OFF
 echo script folder: %1
 echo server: %2
 echo database: %3
 echo database user: %4
 echo ----------------

setlocal enabledelayedexpansion
cd /d %1

for %%d in (*.sql) do (
   set filecontent=
   for /f "delims=" %%a in (%%d) do (set "filecontent=!filecontent!%%a")
   set "temp=!filecontent!"
   set "filecontent=!temp:#PARAM#=%4!"
   echo executing %%d
   echo ----------
   sqlcmd -S%2 -E -Q"!filecontent!" -d%3 
)
pause

then the above script can easily be replaced with PowerShell. Using PS gives us the same flexibility as scripting languages and at the same time allows us to interact with the .NET framework when required.

Before we can start executing PS scripts on our computer, we need to enable script execution, which is disabled by default. We can do that using the following cmdlets:

 #enable executing 
 Set-ExecutionPolicy RemoteSigned

 #get back to default settings
 Set-ExecutionPolicy Default

After we've created the test.ps1 file, we can start the implementation. In order to make testing easier we can define default params inside the script using the param(...) constructor. Please note that by using the [string] keyword we are actually creating strongly typed objects. The defined parameters will be overwritten by the user's passed-in params.


The first command (dir $ScriptFolder) just lists all files within the directory, then the ForEach-Object function iterates through all the files, reading their content into the $query variable. Next, the #PARAM# placeholder in $query is replaced with our parameter. The final task is just to execute the SQLCMD command, passing in the query and the database connection settings.

param (
[string]$ScriptFolder = "Database\Procedures",
[string]$Server = ".\SQL2008R2",
[string]$Database = "MyDb",
[string]$DatabaseUser = "user"
)


(dir $ScriptFolder) | 
  ForEach-Object { 
    $query = get-content $_.FullName -Raw
    $query = $query.Replace("#PARAM#", $DatabaseUser);
    SQLCMD -S $Server -E -Q $query -d $Database
   }

In order to run our Power-Shell script we need just execute following line:

 #command line
 powershell -noexit C:\test.ps1 -ScriptFolder "C:\Database\Procedures" -Server ".\SQL2008R2" -Database "MyDB" -DatabaseUser "user"

Good luck with your Power-Shell scripting 🙂

Deploying website files to multiple folders with exclusions (Tue, 26 May 2015)
https://pxm-software.com/deploying-website-files-to-multiple-folders-with-exclusions/

When automating deployments, it is sometimes necessary to create copies of your website/folder in different locations with some files excluded, e.g. configuration files or temp folders.

The script below will help you achieve that. It can be triggered automatically by your post-deployment action (possibly on your CI server) or executed manually. The script simply copies all files from the source folder into multiple locations, except for the files and sub-folders you specify. Files with the same names will be overwritten in the target locations.


param (
##main paths
[string]$Source_wwwroot = "c:\inetpub\wwwroot\sourceApp",
[string]$Copy1_wwwroot = "c:\inetpub\wwwroot\copy1App",
[string]$Copy2_wwwroot = "c:\inetpub\wwwroot\copy2App",
##copy exceptions
$exclude = @("Web.config","Web.Debug.config","Web.Release.config"),
$excludeMatch = @("TempFiles","TempImages","obj")
)

if(-not(Test-Path $Source_wwwroot)) { "Source folder not found!"; return;}
if(-not(Test-Path $Copy1_wwwroot)) { "Copy1 folder not found!"; return;}
if(-not(Test-Path $Copy2_wwwroot)) { "Copy2 folder not found!"; return;}

If (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{ 
  "You need to run this script as admin!";
  sleep 3
  return;
}

Write-Host "`n#######################`nThis script will copy files from: `n$Source_wwwroot `nto: `n$Copy1_wwwroot and`n$Copy2_wwwroot `n";

Write-Host "except files: $exclude";
Write-Host "except folders: $excludeMatch";

Read-host -prompt "`n#######################`nPress enter to start copying the files"

function CopyFiles($from, $to){
    [regex] $excludeMatchRegEx = '(?i)' + (($excludeMatch | foreach { [regex]::Escape($_) }) -join '|')
    Get-ChildItem -Path $from -Recurse -Exclude $exclude | 
     where { $excludeMatch -eq $null -or $_.FullName.Replace($from, "") -notmatch $excludeMatchRegEx} |
     Copy-Item -Destination {
      if ($_.PSIsContainer) {
       Join-Path $to $_.Parent.FullName.Substring($from.length)
      } else {
       Join-Path $to $_.FullName.Substring($from.length)
      }
     } -Force -Exclude $exclude
 }

 "copying files for copy1..."
 CopyFiles $Source_wwwroot $Copy1_wwwroot

 "copying files for copy2..."
 CopyFiles $Source_wwwroot $Copy2_wwwroot

"Done!";
sleep 10
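The script can also be called with custom paths, e.g. from a post-deployment step (the script file name and paths are assumptions):

 .\Copy-Websites.ps1 -Source_wwwroot "d:\www\liveApp" -Copy1_wwwroot "d:\www\stagingApp" -Copy2_wwwroot "\\backupserver\www\mirrorApp"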

Enjoy!

PowerShell unattended execution with auto-retry and SMS alerts (Sat, 02 May 2015)
https://pxm-software.com/powershell-unattended-execution-with-auto-retry-and-sms-alerts/

Running automated processes seems to be a very easy task. However, it may become a little more complicated for critical unattended operations. Imagine a situation where you have to run a critical, automated process at night, while your business relies on its successful completion.

For example, say you are running an ETL process at night that loads data to be available for the users the next morning. You want to be warned early if something is wrong, as you won't have enough time to reload it the next morning, say before 9am. During the process, you may get either a data-related or an infrastructure-related error: one of the servers may be down for a few seconds, causing the entire operation to fail.

The script below resolves some of the potential problems that may occur in such a scenario by auto-retrying processes on error and sending SMS alerts directly to your mobile, so you will probably be alerted much earlier, unless you read emails at night 🙂

 

In the script, you can define multiple input files to be run by the process you define. For example, you may define multiple dtsx packages to be run one by one. Whatever the process runs, you need to make sure that it returns 0 on successful completion or a non-zero integer on error. If the exe process does not close itself returning an exit code, it will hang forever.
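For a custom process, a minimal sketch of that exit-code contract could look like this (a hypothetical wrapped script; the runner passes the input file path as the argument):

 param([string]$inputFile)
 try {
     #...do the actual work on $inputFile here...
     exit 0   #success: the runner moves on to the next input file
 }
 catch {
     exit 1   #failure: the runner logs it, alerts and retries
 }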

Apart from the input files and the exe process, you also need to define the execution wait times, the email recipient list, the SMTP server and, if you need to log audit information, a database connection. If you want to send SMS alerts, you will also need to create an account on txtlocal.co.uk.

For this example I have defined three csv files (data1.csv, data2.csv, data3.csv) to be opened in Notepad (with auto retry twice for each of them); closing Notepad manually will simulate successful completion. Of course, you need to adjust this to your requirements.

 

 param (
[string[]]$inputFiles = @(
#add paths to more files if needed here
#format: input file path;NumberOfRetries,
#########################################

"c:\data1.csv;2",
"c:\data2.csv;2",
"c:\data3.csv;2",

#########################################
""),
[string]$exePath = "C:\Windows\System32\notepad.exe",
[string]$LoadSuccessfulNotificationList = "email1@mydomain.com;email2@mydomain.com",
[string]$ErrorNotificationList = "email1@mydomain.com;email2@mydomain.com",
[int]$RetryWaitTimeSec = 60,
[int]$NextFileExecWaitTimeSec = 3,
[string]$connString = "Data Source=myServer,1433;Initial Catalog=MyDatabase;integrated security=SSPI;persist security info=False;Trusted_Connection=Yes"
)

[System.Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | out-null

##define phone numbers to send error alerts to (format: 441234567890,441234567892)
$global:numbers = [System.Web.HttpUtility]::UrlEncode("441234567890");

#####################no need to edit below

$global:LogInfo = "";
$global:SmsLogInfo = "";
$global:errorFlag = 0;
[int]$firstFlag = 1;


function SendSms([string]$smsMessage)
{
    if($smsMessage.trim().Length -eq 0) { return;}
    if($smsMessage.trim().Length -gt 155) 
    {
      $smsMessage = $smsMessage.trim().Substring(0,150) + "...";
    }

    $username = [System.Web.HttpUtility]::UrlEncode("mail@mydomain.com");
    $hash = [System.Web.HttpUtility]::UrlEncode("my hash - get it from txtlocal.co.uk");##if not working please check credentials at https://control.txtlocal.co.uk/docs/
    ###
    $message = [System.Web.HttpUtility]::UrlEncode($smsMessage);
    $sender = [System.Web.HttpUtility]::UrlEncode("My alerts");
    ###
    $data = 'username=' + $username + '&hash=' + $hash + '&numbers=' + $global:numbers + "&sender=" + $sender + "&message=" + $message;
    $response = Invoke-WebRequest -Uri "https://api.txtlocal.com/send/?$data";
    ###
    $ser = New-Object System.Web.Script.Serialization.JavaScriptSerializer;
    $result = $ser.DeserializeObject($response.Content);

    return $result;
}

Function LogToDatabase($exitCode,$inputFilePath,$duration,$cpu,$description)
{
    $inputFileName = [System.IO.Path]::GetFileName($inputFilePath);

    $SqlConnection = new-object System.Data.SqlClient.SqlConnection;
    $SqlConnection.ConnectionString = $connString;
    $SqlCommand = $SqlConnection.CreateCommand();
    $SqlCommand.CommandType = [System.Data.CommandType]::StoredProcedure;
    $SqlCommand.CommandText = "MonitorAudit_Save";

    $SqlCommand.Parameters.AddWithValue("@PCName", [Environment]::MachineName + " - " + [Environment]::UserDomainName + "/" + [Environment]::UserName) 
    $SqlCommand.Parameters.AddWithValue("@ExePath", $exePath) | Out-Null

    $SqlCommand.Parameters.AddWithValue("@ExitCode", $exitCode) | Out-Null
    $SqlCommand.Parameters.AddWithValue("@InputFilePath", $inputFilePath) | Out-Null
    $SqlCommand.Parameters.AddWithValue("@InputFileName", $inputFileName) | Out-Null

    $SqlCommand.Parameters.AddWithValue("@Duration", $duration) | Out-Null

    $SqlCommand.Parameters.AddWithValue("@CPU", $cpu) | Out-Null
    $SqlCommand.Parameters.AddWithValue("@Description", $description) | Out-Null

    $SqlConnection.Open()
    $SqlCommand.ExecuteNonQuery()
    $SqlConnection.Close()
}

 Function SendEmail($SendEmailsTo_, $EmailSubject_, $changes_)
 {
   $emailFrom = "errors@mydomain.com" 
   $smtpserver="mysmtp_server" 
   $smtp=new-object Net.Mail.SmtpClient($smtpServer) 
 
   foreach ($email in $SendEmailsTo_.split(';'))
   {   
      if($email.Length -gt 0)
      {
        $smtp.Send($emailFrom, $email, $EmailSubject_, $changes_);
      }
   }
 }
 
$inputFiles | Foreach-Object {
   
  if($_.trim().Length -gt 0)  {

      [int]$numOfRetries = 1;
      if($_.split(';').Length -gt 1) {
        $numOfRetries = $_.split(';')[1]; if($numOfRetries -lt 1) { $numOfRetries = 1 }
      }

      if($firstFlag -eq 0 -and $NextFileExecWaitTimeSec -gt 0){
        $global:LogInfo += "Waiting $NextFileExecWaitTimeSec seconds to execute next input file...`r`n`r`n";
        Start-Sleep -s $NextFileExecWaitTimeSec
      }
      if($firstFlag -ne 0) { $firstFlag = 0;}

      for ($i = 0; $i -le $numOfRetries -1; $i++) {
       try
        {
           [string]$inputFilePath = $_.split(';')[0]; "processing $inputFilePath ..."; $dateStart = (Get-Date).ToString(); [DateTime]$dateStartObj = (Get-Date);

           if(-not(Test-Path $inputFilePath))
           {
              throw New-Object System.Exception ("Input file not found: $inputFilePath");
           }

           if(-not(Test-Path $exePath))
           {
              $global:errorFlag =1;
              throw New-Object System.Exception ("Process exe not found: $exePath");
           }

           #add -windowstyle Hidden when in production
           $exeProcess= Start-Process $exePath -Wait -PassThru -ErrorAction SilentlyContinue  -WarningAction SilentlyContinue -ArgumentList " `"$inputFilePath`"" 
           

           #log info
           $cpu = $exeProcess.CPU.ToString("##.#"); $dateEnd = (Get-Date).ToString(); $global:LogInfo += " $inputFilePath `r`nAverage CPU: $cpu | Start time $dateStart | End time $dateEnd`r`n`r`n";

           if($exeProcess.ExitCode -ne 0)
           {
              throw New-Object System.Exception ("execution error");
           }
           
           #uncomment below line to log to database
           #LogToDatabase $exeProcess.ExitCode $inputFilePath (Get-Date).Subtract($dateStartObj).TotalSeconds $exeProcess.CPU.ToString("##.#") "OK" | Out-Null

           break;
        } 
        catch { 
           $global:errorFlag =1;
          try
           {
           $inputFileName = [System.IO.Path]::GetFileName($inputFilePath);
           } catch { $inputFileName = ""; }

           $msg = "$LogInfo Unexpected Error when processing $inputFileName. Error: $_";  Write-Host $msg -foreground Red;
           $global:LogInfo += "###########`r`nUnexpected Error when processing $inputFileName`r`n" + "Error: $_ `r`n##########`r`n`r`n";
 
           $global:SmsLogInfo += "Error processing: " + $inputFileName + " |"; 

           $cpu = "0";
           if ($exeProcess -ne $null) { $cpu = $exeProcess.CPU.ToString("##.#"); }

           #uncomment below line to log to database
           #LogToDatabase 1 $inputFilePath (Get-Date).Subtract($dateStartObj).TotalSeconds $cpu "Error: $_" | Out-Null

           #SendEmail $ErrorNotificationList "loading error" $msg
           if($i -ge $numOfRetries - 1) { break; }

           #retry it
           $global:LogInfo += "Waiting $RetryWaitTimeSec seconds to retry...`r`n`r`n";
           Start-Sleep -s $RetryWaitTimeSec 
        } 
      }
   }
} 

if($errorFlag -eq 0) {
   SendEmail $LoadSuccessfulNotificationList "Loading process successful" "Processing has been successful for following input files: `r`n-------------------------`r`n$LogInfo Have a nice day :)";
}
else
{
   SendEmail $ErrorNotificationList "Loading process finished with errors" "Processing has NOT been successful for some of the input files: `r`n-------------------------`r`n$LogInfo";
   SendSms $global:SmsLogInfo;
}

Enjoy!

Using automation for gathering release changes (Wed, 15 Apr 2015)
https://pxm-software.com/using-automation-for-gathering-release-changes/

If you need to gather release changes when creating installers, the script below can help you. It simply gets all (or only the current) database release changes stored in files. I use it to automatically gather the latest changes after TeamCity builds a release installer; the changes can then be sent to PR for documentation. You can of course use more complicated patterns when extracting the script text if required. To execute it, just pass in the update script location, the version number and the email addresses.


 

 param (
 [string]$ScriptFolder = "C:\Database\Updates",
 [string]$Version = "2.11.0.0",
 [string]$SendEmailsTo = "test@test.com",
 [boolean]$GetAllChanges = 0
)

#sends notification emails
 Function SendEmail($SendEmailsTo_, $EmailSubject_, $changes_)
 {
   $emailFrom = "teamcity@test.com" 
   $smtpserver="smtp.test.com" 
   $smtp=new-object Net.Mail.SmtpClient($smtpServer) 

   foreach ($email in $SendEmailsTo_.split(';'))
   {   
      $smtp.Send($emailFrom, $email, $EmailSubject_, $changes_)
   }
 }

 #get file by version number
 Function GetReleaseFileName($Version_)
 {
    $VersionFilename_ ="v";
     $Version_.split('.') | ForEach-Object{
        $VersionFilename_ +=  [System.Convert]::ToDouble($_).ToString("00"); 
    }
    $VersionFilename_ += ".sql";

     #format e.g. v02110000.sql for version 2.11.0.0
    return $VersionFilename_;
 }

#return if no emails added
if($SendEmailsTo.trim() -eq "") { "no emails defined!"; return;}

$VersionFilename =  GetReleaseFileName $Version;

$EmailSubject = "Release $Version database changes";
$changes = "Changes in release file $VersionFilename `r`n `r`n";
$hasChanges = 0;

if(!$GetAllChanges) {  
    (dir $ScriptFolder) | sort CreationTime -Descending |
     ForEach-Object {  
        $fileContent = get-content ($_.Directory.FullName + "\" + $VersionFilename) -Raw;

        $patterns = 'alter','drop','insert','delete','update','create'; #patterns/strings to find
        $hasChanges = 0;
        foreach ($pattern in $patterns)
        {
           $fileContent.split("`r`n") | ?  { $_ -match $pattern } | % { $changes += $_.trim() + "`r`n"; $hasChanges = 1; } 
        } 

        if($hasChanges -eq 0) { $changes += "`r`nNo database changes for release $Version !"}

        write-output $changes

        SendEmail $SendEmailsTo $EmailSubject $changes;

        break;        
    }
}
else
{
  #parse all files for all previous releases
  $EmailSubject = "All database changes";
  $changes = "All database changes: `r`n `r`n";

  (dir $ScriptFolder) | sort CreationTime -Descending |
    ForEach-Object {  
        $fileContent = get-content ($_.Directory.FullName + "\" + $_.name) -Raw;

        $patterns = 'alter','drop','insert','delete','update','create';
        $changes += "`r`nChanges in release file $_`r`n";
        $hasChanges = 0;

        foreach ($pattern in $patterns)
        {          
           if($fileContent)
           {
              $fileContent.split("`r`n") | ?  { $_ -match $pattern } | % { $changes += $_.trim() + "`r`n"; $hasChanges = 1; }  
           } 
        } 

        if($hasChanges -eq 0) { $changes += "`r`nNo database changes for release " + $_.name.Replace(".sql","").Replace("v0","");}

        $changes += "`r`n-------------------------------`r`n";      
    }

    write-output $changes 

    SendEmail $SendEmailsTo $EmailSubject $changes;
}
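Example invocations, assuming the script is saved as release_changes.ps1 (the file name is an assumption):

 #mail only the changes for release 2.11.0.0
 .\release_changes.ps1 -Version "2.11.0.0" -SendEmailsTo "docs@test.com"

 #mail the full history of all release files
 .\release_changes.ps1 -GetAllChanges $true -SendEmailsTo "docs@test.com"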

I have included the full script sample in the attachment (release_changes_notifier).

Getting data from TeamCity Rest API – PowerShell (Sun, 05 Apr 2015)
https://pxm-software.com/getting-data-from-teamcity-rest-api-powershell/

A simple script that gets build data from the TeamCity server REST API. When executed as a build step, you can pass the current build id to it. Having that, we can get the previous build and its data, and then proceed with further processing, which can be, for example, gathering documentation, statistics, etc. Please make sure you have enabled running PS scripts on your server.

Enjoy!


 param (
[int]$BuildId = 2274
)

#gets last build date by build id
Function GetLastBuildDateByBuildId($LastBuildId_)
{
  $secpasswd = ConvertTo-SecureString "mypassword" -AsPlainText -Force
  $cred = New-Object System.Management.Automation.PSCredential("username", $secpasswd)

  $build = (Invoke-RestMethod -Credential $cred -Uri "https://teamcity:88/httpAuth/app/rest/builds/id:$LastBuildId_").build;
  return [DateTime]::ParseExact($build.startdate.split("T")[0],'yyyyMMdd', $null) 
}

#get last build date
$lastBuildDate = GetLastBuildDateByBuildId  ($BuildId -1)

$lastBuildDate
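When used as a TeamCity build step, the current build id can be passed in via the built-in parameter (the script file name is an assumption):

 powershell -NoProfile -File .\get-build-date.ps1 -BuildId %teamcity.build.id%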

Live database restoration to test server using Powershell (Tue, 24 Mar 2015)
https://pxm-software.com/live-database-restoration-to-test-server-using-powershell/

It is a very common requirement during the development stage to copy the live database to a test server on a daily/weekly basis. In such a scenario there is also very often a need to clean the database and perform some operations before restoring it in the test environment.

For example, if the live database is very big and you don't need all of its tables on your test server, you can drop them during the copy process. The same applies if you need to adjust permissions, change the database name, optimize indexes or just shrink the database file.


The script below will help you automate the copy database process. On every run, it checks for the latest database backup file in the defined location. Next, it copies that file to your local temp folder. After that, it restores the database under a temporary name, so you can continue working while the restoration lasts (useful for big databases). When the database is restored, the script drops the old test database and renames the new one back to the original name. The old database files are also deleted to free up disk space; the dated files have names like "TestDB_DEV_14_08_19.mdf".

As a result, you will have up-to-date live database restored on your test server.

 param (
#local test datatabase server to restore your live database 
[string]$TargetSQLServer= ".\MSSQLSERVER_2012",

#name of your live database
[string]$LiveSQLDatabase= "TestDB",

#name of your final test database being set after restoration
[string]$TargetSQLDatabase= "TestDB_DEV",

#data root of mssql server - containing .mdf files of your test database
[string]$TargetSQLDatabaseDataRoot= "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\DATA",

#folder where current backup file will be copied and processed - must be local folder
[string]$TempLocalFolder= "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\Backup_temp",

#backup location of your live database - can be located on the network drive
[string]$BackupLiveFolder = "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\Backup",

#show copy progress or copy in bacground
[bool]$silentCopying = 0,

#log location
[string]$Logfile = "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER_2012\MSSQL\Log\CopyLiveDBToTestLog.txt"
)

Function LogWrite
{
   Param ([string]$logstring)
   $currentdate = get-date -format "M-d-yyyy HH:mm:ss"
   Add-content $Logfile -value "$currentdate | $logstring"
   write-host "$currentdate | $logstring"
}

try
{ 
    LogWrite "acquiring backup file..."
    
    #get most recent backup file
    $liveBackupFile  = (dir -Path $BackupLiveFolder) | where {$_.extension -eq ".bak" }  | 
       sort lastwritetime -Descending | select -first 1;
      
    LogWrite "validating paths..."
    
    if($liveBackupFile -eq $null -or -not (Test-path $liveBackupFile.FullName)){ 
       LogWrite "Could not find the backup file, check the backup folder location!";
       return;}

    if(-not(Test-path $TempLocalFolder)) { LogWrite "Incorrect local backup location!"; return;}

    #copy backup file to temp location
    if(-not(Test-Path "$TempLocalFolder\$liveBackupFile"))
    {
       #remove all backups in temp location   
        dir -path $TempLocalFolder |  Foreach-Object {
        get-childitem -Path $_.FullName |  
         where-object { 
            ($_.FullName.tolower().EndsWith(".bak") -and $_.FullName.tolower().contains($LiveSQLDatabase.ToLower()))
          } | 
         Foreach-Object {
            LogWrite "removing file: $_.FullName"
            remove-item -path $_.FullName
          } 
        }

        LogWrite "copying backup file $liveBackupFile ..."
        
        if(-not($silentCopying -eq 1))
        {
            #use shell to see progress
            $FOF_CREATEPROGRESSDLG = "&H0&"
            $objShell = New-Object -ComObject "Shell.Application"
            $objFolder = $objShell.NameSpace($TempLocalFolder) 
            $objFolder.CopyHere($liveBackupFile.FullName, $FOF_CREATEPROGRESSDLG)
        }
        else
        {
           copy-item -Path $liveBackupFile.FullName -Destination "$TempLocalFolder\$liveBackupFile" -PassThru -Verbose
        }

        LogWrite "copying backup file $liveBackupFile finished"
    }
    else
    {
       LogWrite "backup file already exists: $TempLocalFolder\$liveBackupFile"
    }

    #----------------------------------------------------------

    LogWrite "connecting to $TargetSQLServer"

    $TargetSQLDatabase_temp = $TargetSQLDatabase + "_"+ (get-date).tostring("yy_MM_dd");
    
    $mdfCurrentPath = "$TargetSQLDatabaseDataRoot\$TargetSQLDatabase_temp.mdf"
    if(Test-Path $mdfCurrentPath)
    {
        LogWrite "database $TargetSQLDatabase_temp.mdf already exists!"
        return;
    }

    LogWrite "restoring database from file: $TempLocalFolder\$liveBackupFile to temp database $TargetSQLDatabase_temp ..."

    $DBLogicalFileName = $LiveSQLDatabase;
    $DBLogicalFileName_log = $LiveSQLDatabase + "_Log";

    $query = "RESTORE DATABASE [$TargetSQLDatabase_temp] FROM   
    DISK = N'$TempLocalFolder\$liveBackupFile' WITH  FILE = 1,  
    MOVE N'$DBLogicalFileName' TO N'$TargetSQLDatabaseDataRoot\"+$TargetSQLDatabase_temp+".mdf',   
    MOVE N'$DBLogicalFileName_log' TO N'$TargetSQLDatabaseDataRoot\"+$TargetSQLDatabase_temp+".ldf', 
    NORECOVERY,  NOUNLOAD,  REPLACE,  STATS = 10";


    SQLCMD -S $TargetSQLServer -E -Q $query 

    Start-Sleep -s 10

    #bring it to live from restoring mode
    $query = "RESTORE DATABASE "+$TargetSQLDatabase_temp+" WITH RECOVERY"
    SQLCMD -S $TargetSQLServer -E -Q $query 
    
    LogWrite "dropping old $TargetSQLDatabase database..."

    #drop current database
    $query = "ALTER DATABASE [$TargetSQLDatabase]
    SET OFFLINE WITH ROLLBACK IMMEDIATE
    DROP DATABASE [$TargetSQLDatabase]"
    SQLCMD -S $TargetSQLServer -E -Q $query 
    
    Start-Sleep -s 10
    
    LogWrite "renaming temp database $TargetSQLDatabase_temp to $TargetSQLDatabase ..."

    #rename temp database 
    $query = "ALTER DATABASE ["+$TargetSQLDatabase_temp+"] SET SINGLE_USER WITH ROLLBACK IMMEDIATE    
    ALTER DATABASE ["+$TargetSQLDatabase_temp+"] MODIFY NAME = [$TargetSQLDatabase]
    ALTER DATABASE [$TargetSQLDatabase] SET MULTI_USER"
    SQLCMD -S $TargetSQLServer -E -Q $query 

    LogWrite "recreating db permissions..."
    
    #recreate permissions
    $query = "ALTER AUTHORIZATION ON DATABASE::[$TargetSQLDatabase] TO [sa]
    use  [$TargetSQLDatabase]
    ALTER User [domain\Administrators] with LOGIN = [domain\Administrators]"

    #enable if used
    #SQLCMD -S $TargetSQLServer -E -Q $query 
           
    LogWrite "cleaning and shrinking database $TargetSQLDatabase ..."
    
    $query = "use $TargetSQLDatabase;
    ALTER DATABASE $TargetSQLDatabase SET RECOVERY SIMPLE;   
    truncate table dbo.TestTable;

    DBCC SHRINKFILE('$DBLogicalFileName_Log');
    DBCC SHRINKFILE('$DBLogicalFileName')"
    SQLCMD -S $TargetSQLServer -E -Q $query 
    
    
    LogWrite "optimizing table indexes..."
    
     $query = "USE [$TargetSQLDatabase];
        
        EXEC [dbo].[IndexOptimize]
		@IndexFragmentationMin  = 5,
        @IndexFragmentationRebuildMin  = 30,
		@skiplargetables = 0,
		@PrintInformationOnly = 0"
 
    #enable if used
    #SQLCMD -S $TargetSQLServer -E -Q $query 
    
    
    LogWrite "removing old database files..."
       
    #remove all old database files   
    $fileEntries = [IO.Directory]::GetFiles("$TargetSQLDatabaseDataRoot");
    $todayDatePart = (get-date).tostring("yy_MM_dd");

     foreach($fileName in $fileEntries) 
     { 
        if($fileName.tolower().contains("$TargetSQLDatabase".tolower()) -and -not($fileName.tolower().contains($todayDatePart)))
        {
           LogWrite $fileName
           #remove-item -path $fileName
        }
     }    
      
    LogWrite "Done!"

} 
catch { 
     $msg = "Unexpected Error. Error details: $_.Exception.Message"
     LogWrite $msg 
     Write-Host $msg -foreground Red 
}
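To keep the test server current, the script can be wired into a nightly or weekly scheduled task, e.g. (the path and file name are assumptions):

 powershell -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\CopyLiveDBToTest.ps1"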

Enjoy!
