Thursday, April 27, 2017

Send email from Azure SendGrid using PowerShell

Abstract

SendGrid is popular. No doubt about it! Especially for dev and test scenarios, where you get 25,000 free emails a month. This post describes how you can use SendGrid to send email using PowerShell.

Don't expect me to write a blog post on C# code for sending email using SendGrid, because SendGrid will keep updating its API (V2, V3 and so on) and ultimately the C# code will also change. You can refer to the C# (or other language) code samples in the SendGrid documentation itself.

Provision SendGrid account on Microsoft Azure

Let's create a SendGrid account quickly on Azure. Log in to the Azure portal at http://portal.azure.com. SendGrid is currently available only in selected regions; for example, it is not available in the India regions. I am based out of India, so the nearest data center for me is Southeast Asia (referred to as SEA in this blog). I will select the SEA data center for creating the SendGrid account on Azure. Therefore, let's first add a resource group in the SEA region and then create SendGrid in the same resource group.

Select "Resource Groups" in the left-hand pane. It lists all resource groups present in your subscription. Click the "Add" button at the top. Provide the resource group name as SendEmailRG, a subscription of your choice and the location as SEA, as shown below. Click Create to finish the resource group creation process.



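By the way, if you prefer PowerShell for this step, the portal clicks above boil down to a one-liner. A minimal sketch, assuming the AzureRM module is installed and you are already logged in with Login-AzureRmAccount –

New-AzureRmResourceGroup -Name "SendEmailRG" -Location "Southeast Asia"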
Click on New. You will see a search box at the top. Type SendGrid in the search box; the suggestions will automatically display "SendGrid Email Delivery". Select it and press Enter. Another horizontal blade will open. Select the SendGrid Email Delivery option and click Create.

Provide the name of the account as "EmailSample". Provide a password of your choice and remember it; we will need it later. Select your Azure subscription and select the existing option for the resource group. Select the SendEmailRG resource group. This step ensures that the SendGrid account is created in the SEA region.

On the pricing tier blade, select Free as shown below –



If you have the budget, feel free to select one of the paid pricing tiers.

Provide the contact information; mine is shown below. Remember, you will receive a confirmation email from SendGrid, so don't put dummy values like these; give genuine values.



Select the "I give…" checkbox for the legal terms and click Purchase. Then proceed and finish the SendGrid account creation process.

Retrieve essential detail from SendGrid account

To send an email we need the username, password and SMTP server name. This information is available in the Azure portal under your SendGrid account. So go to the SendGrid account we created in the step above from the Azure portal. Click on Configurations and you will see the required information listed as shown below –


PowerShell to Send Email

Let's first define the essential parameters based on the information we captured from the Configurations tab of the SendGrid account –

$Username ="YourUserNameFromPortal"

$Password = ConvertTo-SecureString "YourPasswordWhichYouEnteredDuringSendGridCreation" -AsPlainText -Force

$credential = New-Object System.Management.Automation.PSCredential $Username, $Password

$SMTPServer = "smtp.sendgrid.net"

Any email requires basic information such as the from address, the to address and a subject line. Let's add this information as parameters –

$EmailFrom = "No-reply@azureadmin.com"

[string[]]$EmailTo = "YourEmail@gmail.com"
$Subject = "Sending sample email using SendGrid Azure and PowerShell"

$Body = "This is a sample email sent using a SendGrid account created on Microsoft Azure. The script written is easy to use."

If you observe, I am using an array for the $EmailTo variable. This is so that I can add multiple email addresses to the "To" list. If I want to send the email to multiple addresses, I will add them as below –

[string[]]$EmailTo = "YourEmail1@gmail.com", "YourEmail2@yahoo.com"

Then use below command to send the email –

Send-MailMessage -SmtpServer $SMTPServer -Credential $credential -UseSsl -Port 587 -From $EmailFrom -To $EmailTo -Subject $Subject -Body $Body -BodyAsHtml
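For convenience, the complete script stitched together looks like below. It is a minimal sketch; replace the placeholder username, password and email addresses with your own values.

$Username = "YourUserNameFromPortal"
$Password = ConvertTo-SecureString "YourPasswordWhichYouEnteredDuringSendGridCreation" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential $Username, $Password
$SMTPServer = "smtp.sendgrid.net"
$EmailFrom = "No-reply@azureadmin.com"
[string[]]$EmailTo = "YourEmail1@gmail.com", "YourEmail2@yahoo.com"
$Subject = "Sending sample email using SendGrid Azure and PowerShell"
$Body = "This is a sample email sent using a SendGrid account created on Microsoft Azure."
Send-MailMessage -SmtpServer $SMTPServer -Credential $credential -UseSsl -Port 587 -From $EmailFrom -To $EmailTo -Subject $Subject -Body $Body -BodyAsHtml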

Manage option on SendGrid account

Look at the below screenshot for the highlighted option.



This option will help you with generating API keys, monitoring your SendGrid account, failed emails, reports and so on. No, this is not another blog post for another day, because the SendGrid documentation explains all these options in detail. Why would you want me to write a blog post on information which is already explained so well?

Note -

SendGrid provides API keys which can be used to grant limited access. The above PowerShell code does not use an API key for sending emails. In case you want to use SendGrid API key based access control and then send the email using PowerShell, I would suggest writing a C# console exe with the required parameters and then invoking it using PowerShell. 😊
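Alternatively, if you generate an API key from the Manage option, you can call SendGrid's v3 Mail Send REST API directly from PowerShell without any C# code. The snippet below is a minimal sketch on my side; the endpoint URL and JSON schema are taken from SendGrid's v3 API documentation, so verify them against the current docs before relying on it.

$apiKey  = "YourSendGridApiKey"
$headers = @{ "Authorization" = "Bearer $apiKey"; "Content-Type" = "application/json" }
# Build the request body expected by the v3 mail/send endpoint
$mailBody = @{
    personalizations = @( @{ to = @( @{ email = "YourEmail@gmail.com" } ) } )
    from             = @{ email = "No-reply@azureadmin.com" }
    subject          = "Sending sample email using SendGrid API key and PowerShell"
    content          = @( @{ type = "text/html"; value = "This is a sample email sent using the SendGrid v3 API." } )
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Uri "https://api.sendgrid.com/v3/mail/send" -Method Post -Headers $headers -Body $mailBody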


Happy Emailing!!

Tuesday, March 7, 2017

How to download Azure blob storage contents in Azure Linux VM using Azure CLI

Abstract

I always get this question – how can I download Azure blob storage files to an Azure Linux VM? When I say "use Azure CLI (Command Line Interface)", the next question asked is – do you have a step by step guide?
Well, this blog post is the answer to both questions.

The high level approach is outlined below –
  1. Provision a Linux Azure VM in a subnet. [Of course, this step is out of scope of this article; for detailed steps refer this guide.]
  2. Install Azure CLI in the Linux VM.
  3. Upload sample files to Azure storage and then download them into a folder in the Linux VM.

This article assumes you understand Azure storage and related concepts.

Wow, this is the first blog post from me on Linux and Azure.

Prepare your Azure Linux VM

I have provisioned an Azure Linux VM with CentOS 7.2 as the OS, added it to a subnet, and attached an NSG with only port 22 open inbound. I have also attached a public IP to this VM so that I can SSH to it from anywhere over port 22. The step by step guide link is already shared above.
If you have a different OS than CentOS, the commands in the steps below will change; however, the high level approach remains the same.

Install CLI in Azure CentOS VM

First SSH to your Linux VM and run the command "sudo su" [without double quotes], so that in subsequent steps we don't face the awesome "access denied" or "permission denied" errors, and we don't have to prefix every command we run with "sudo".

There are two versions of Azure CLI –
1.0 – This stuff is written in Node.js and supports both Classic [the old way of doing things on Azure] and ARM [the new, fancy way of doing things on Azure].
2.0 – To make this version sound impressive, Microsoft calls it the "next generation CLI"; it is written in Python and supports only ARM mode.

I will be using version 2.0. Hence I also need Python installed on the Azure Linux VM. So let's first install the latest Python version on the Azure CentOS VM.
Let’s make sure that yum is up to date by running below command –

sudo yum -y update

The -y flag tells the system: "relax, we are aware that we are making changes, hence do not prompt for confirmation and save our valuable time". This command execution will take a good amount of time.
Next install yum-utils using below command –

sudo yum -y install yum-utils

Now we need to install IUS (Inline with Upstream Stable). Don't get scared by the name. This is a community project which ensures that whichever Python 3 version we install, we get the most stable build. Run the below command to install IUS –
sudo yum -y install https://centos7.iuscommunity.org/ius-release.rpm

After IUS, we can install a recent version of Python. As of writing, the latest version is 3.6, but I will install 3.5 to be on the safer side. In the Python 3.5 line, 3.5.3 is the latest, so let's install it.
sudo yum -y install python35u-3.5.3 python35u-pip

To verify, simply run the below command and its output should be 3.5.3.
python3.5 -V

Now install the required prerequisites on CentOS using below command –
sudo yum check-update; sudo yum install -y gcc libffi-devel python-devel openssl-devel

Finally, back to installation of CLI 2.0 -


curl -L https://aka.ms/InstallAzureCli | bash
This may prompt you for the directory in which to install the CLI. Press Enter to keep the default installation path, which would be "/root/lib/azure-cli". Similarly, keep pressing Enter if more prompts are displayed.
Restart the command shell for the changes to take effect –
exec -l $SHELL
Just type "az" [without quotes] and it should show you the Azure CLI command information on CentOS. This means the installation of Azure CLI 2.0 on Linux is successful.

Run below command to list storage related commands.
az storage -h

Upload sample files to Azure Storage

This step is straightforward. Use the Azure portal to create one standard [not premium] ARM based storage account. Create a container and upload 4 sample files into the container. It would look like below –






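If you would rather do this step from the CLI as well, a sketch is below; run it only after you have added your account and exported the storage credentials as shown in the next two sections. The container name sample and file name File1.txt are just the values used in this post.

az storage container create --name sample
az storage blob upload -c sample -n File1.txt -f ./File1.txt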
Add Azure account to CLI

Run the az login command as shown below. It will prompt you with a code and a link where you need to enter the code. After this you will be asked to log in using your existing Azure credentials. A successful login will show you the subscriptions associated with your account as below –



Set storage account and download the blob

Now set credentials for storage account.
export AZURE_STORAGE_ACCOUNT=YourStorageAccount
export AZURE_STORAGE_ACCESS_KEY=YourStorageAccountKey

Create a directory named test1 using the below command. This is the directory into which we will download the blob contents –
mkdir test1/

After this, run the below command to download the blob file into the test1 folder –
az storage blob download -c sample -n File1.txt -f test1/File1.txt

Change the directory with the command cd test1 and run ls -l. This should list File1.txt as shown below.

Limitation

Using the Azure CLI you can't download all the blobs from a container in a single command. You have to download each and every blob individually; bulk download of Azure blobs is not supported.
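You can still script around this limitation by listing the blob names and downloading them one at a time. The loop below is a sketch, assuming the container is named sample and the storage credentials are exported as shown above –

# list blob names and download each one into the test1 folder
for blob in $(az storage blob list -c sample --query "[].name" -o tsv)
do
    az storage blob download -c sample -n "$blob" -f "test1/$blob"
done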

Resolution to Bulk download

To download all blobs from a container in one go, instead of the Azure CLI we will need to use the Azure XPlat CLI. Or we can also use PowerShell, as it is open source now [although I have not tried it yet]. It's common to come across many approaches to achieve one task when you are in the open source world. :)
Azure XPlat CLI is a project that provides a cross-platform command line interface to manage Azure. Refer to the documentation here - https://github.com/Azure/azure-xplat-cli. But that is another blog for another day.

Conclusion

So I hope you now understand how easy it is to download Azure blob storage contents in a Linux virtual machine.
Please provide your valuable comments. Good news is it's free!!
Keep Downloading!!

Thursday, January 5, 2017

Domain join Azure VM using Azure Automation DSC

Abstract

Azure Automation has changed a lot since I wrote my last blog about auto-shutdown of Azure VMs using Azure Automation. Looking at the phenomenal rate at which the Azure platform evolves, it makes perfect sense to revisit the same services and write a new blog covering absolutely new features and tasks.
This article is a step by step guide to domain join an Azure VM automatically using the Automation DSC feature. This guide does not cover –
- Step by step flow for creating an Azure Automation account in the Azure portal
- Azure VM provisioning
- Domain configuration on the domain controller

What is DSC?

DSC stands for Desired State Configuration. It's a configuration management tool. There are many configuration management tools available in the market; a few popular names are Chef and Puppet. DSC is the configuration management tool from Microsoft. Basically, it helps automate tasks which would otherwise be very boring to do manually.
An example of such a boring task is domain joining an Azure VM when it is provisioned. I am working with a customer where almost every month they provision 100+ VMs on Azure and remove them. To satisfy the organization's compliance and security policies, all VMs should be domain joined. The poor IT team had to perform this repetitive domain joining task manually almost every day. There was a dedicated team member for this; he was about to go for psychiatric treatment. Thanks to Azure Automation DSC, he is back to normal now.
If you are interested in knowing more about DSC, the link is here - https://msdn.microsoft.com/en-us/powershell/dsc/overview.
Note -
As of today, Azure supports Classic (ASM) and ARM (Azure Resource Manager) types of resource deployment. ARM is the future and this article talks about ARM based resources only. Provisioning an Azure ARM VM and configuring the domain controller are out of scope of this article. Refer to the article - http://www.dotnetcurry.com/windows-azure/1145/active-directory-adfs-azure-virtual-machine-authentication-aspnet-mvc - to understand the quick steps for domain controller provisioning. That article talks about classic VM provisioning, which you can ignore; directly follow the steps from the section "Configure Active Directory" to promote the VM to a domain controller.

Provision Azure Automation Account

The below link specifies the steps to provision an Azure Automation account – Create Azure Automation account. I am using the below values for the same –



In the above screenshot, the subscription name is blurred, because your subscription name will be different from mine and I want to keep mine secret for security purposes. sssssshhhh…
The new Automation account will look as below -



To learn about the various options in an Automation account, like Runbooks, Assets, Hybrid Worker Groups and so on, refer - https://mva.microsoft.com/en-US/training-courses/automating-the-cloud-with-azure-automation-8323?l=C6mIpCay_4804984382.
As our focus is specifically on writing a DSC script to auto domain join VMs, I will not spend time on the various other concepts related to Azure Automation.
With this let’s move forward to actual implementation.

Import xDSCDomainJoin module

xComputerManagement is the DSC module which can be used to domain join a computer; xDSCDomainjoin is a stripped-down version of the same. This module is available in the PowerShell Gallery, which is the central repository for PowerShell modules. To know more, refer - https://www.powershellgallery.com/.
The PowerShell Gallery hosts the xDSCDomainjoin module, and we must first import it into our Automation account before we can use it in our script. The easiest way to import a module into an Automation account is from the Azure portal.
On the Azure portal, select your Automation account. Then click Assets -> Modules. All existing modules will be shown as below –



Click on the "Browse Gallery" option. Search for xDSCDomainjoin in the search box and it will appear as shown below. Then click "Import" and click OK to complete importing the module into the Automation account –



A message will appear saying "Activities being extracted". Let this procedure continue. After a successful import, the assets count will increase by 1 on the main page of the Automation account.
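If you prefer to import the module from PowerShell instead of the portal, something along the lines below should work. Treat it as a sketch: the -ContentLink parameter name and the PowerShell Gallery package URL format are assumptions on my side and worth verifying.

# Import the xDSCDomainjoin module into the Automation account (hedged sketch)
New-AzureRmAutomationModule -ResourceGroupName "YourResourceGroupName" `
    -AutomationAccountName "YourAutomationAccountName" `
    -Name "xDSCDomainjoin" `
    -ContentLink "https://www.powershellgallery.com/api/v2/package/xDSCDomainjoin"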

Installing Azure PowerShell on local machine

On your local machine/ laptop open PowerShell ISE. You need all Azure PowerShell commands available on your local machine. Working on Azure without PowerShell is like Superman without Powers (or underwear…). Therefore, first install Azure PowerShell as per the guide given here - https://docs.microsoft.com/en-us/powershell/azureps-cmdlets-docs/#install-and-configure.

Writing DSC script for domain join

Now, after installation, we must first provide the authentication information of our Azure account to the currently open PowerShell ISE window. For this, run the command –
Add-AzureRmAccount

This will prompt for login. Go ahead and log in to complete the authentication.
Create a new file in PowerShell ISE, save it as DomainJoinConfiguration.ps1 and write the below PowerShell in it –
#prerequisite: the xDSCDomainjoin module must already be imported into the Azure Automation account (done above)

Configuration DomainJoinConfiguration
{
    Import-DscResource -ModuleName 'xDSCDomainjoin'

    #domain credentials to be given here
    $secdomainpasswd = ConvertTo-SecureString "YourDomainPassword" -AsPlainText -Force
    $mydomaincreds = New-Object System.Management.Automation.PSCredential ("UserName@Domain", $secdomainpasswd)

    node $AllNodes.NodeName
    {
        xDSCDomainjoin JoinDomain
        {
            Domain     = 'YourDomain'
            Credential = $mydomaincreds
        }
    }
}

In the above script, replace the YourDomainPassword, UserName@Domain and 'YourDomain' values with your own values. This is your final script to domain join an Azure VM. Now we must upload this file to the Azure Automation account. Therefore, click on DSC Configuration -> Add Configuration as shown below –



On the next window, upload the file we created in the above step and then click OK to complete the domain join DSC configuration. And yes, please provide a meaningful description as shown below -




Why password in plain text?

In the above script, you must have observed the below line -
$secdomainpasswd = ConvertTo-SecureString "YourDomainPassword" -AsPlainText -Force

This forces us to keep the password in plain text. As you may have guessed, this is not a good practice. But I am not going to leave it at that; please read the next sections to understand why we are keeping the password in plain text. So, hold on to your emotions.

Adding Configuration Data to DSC script of Domain Join

Configuration data allows you to separate structural configuration from any environment specific configuration while using PowerShell DSC.

In other words, we want to separate the "WHAT" from the "WHERE". The DSC script we have written above specifies the structural configuration (what). This is where we define what is needed, and it does not change based on the environment: irrespective of whether it is development or production, we want VMs to be domain joined. Environmental configuration specifies the environment in which the configuration is deployed (where). For example, we may need common settings for all nodes and specific settings for specific nodes.

To specify the environmental configuration, we use "config data" and then compile the entire DSC script using that config data. It should contain an "AllNodes" key where you specify the common configuration for all nodes that should get domain joined automatically, and it can then contain other node specific keys. By the way, the Azure VMs we add to a DSC configuration are termed "nodes".
For all nodes, I want to allow plain text passwords and domain user credentials, and I want the specific nodes to be domain joined. Therefore, we will write the config data as –

$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = "*"
            PSDscAllowPlainTextPassword = $True
            PSDscAllowDomainUser = $true
           
        }
        @{
            NodeName = "DomainJoined"
        }
    )
}

I will need to use this configuration data to compile my DSC script.

Compile the DSC configuration

The domain join DSC script has been added to the Azure Automation account and now it is time to compile it so that a .MOF file is generated on the Azure pull server. Once the .MOF is generated, all DSC nodes added to the Automation account receive their configuration from that .MOF file. If you open the DSC configuration, you will observe that a "Compile" button is available at the top in the portal itself.
However, as of today, if you have a DSC script and you wish to compile it using configuration data, PowerShell is the only option. If you compile a DSC script with config data using the portal, you will receive errors. So, we will write a compilation script from the local machine and pass the above mentioned configuration data to it, to compile the DSC script present in Azure Automation.
To compile the DSC script present in the Azure Automation account from your laptop, you will need credentials. This is where a service principal helps. When we create an Azure Automation account, an Azure Active Directory service principal automatically gets added to the Azure AD tenant under which the current subscription is present. To verify, just go to Azure AD in the Azure portal and select "App registrations". You will see the Automation account added as a web app.



Every app registered in Azure AD has an application ID – this is your username.



Every app registered in Azure AD has a secret key – this is your password.



Specify the duration as per your choice, give the key a logical name such as "KeyForDSC" and then click Save. This will automatically generate the key. The secret key is visible only once, after which it won't be visible on the portal, so make sure you store it somewhere safe (not your desktop) for future use.
The Azure AD tenant has a unique ID – this is your tenant ID.



We need to use the Login-AzureRmAccount command with the above credentials to start a compilation job for the DSC script in the Azure Automation account from the local machine. The complete compilation job script, with config data and service principal credentials, is below –
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = "*"
            PSDscAllowPlainTextPassword = $True
            PSDscAllowDomainUser = $true
           
        }
        @{
            NodeName = "DomainJoined"
        }
    )
}

$secpasswd = ConvertTo-SecureString "YourSecretKey" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("YourApplicationId", $secpasswd)
Login-AzureRmAccount -ServicePrincipal -Tenant "YourADTenantId" -Credential $mycreds

$compilationJob = Start-AzureRmAutomationDscCompilationJob -ResourceGroupName 'YourResourceGroupName' -AutomationAccountName 'YourAutomationAccountName' -ConfigurationName 'DomainJoinConfiguration' -ConfigurationData $ConfigData
$compilationJob

Run the above script and the compilation job will start. Once completed, you can view the job status as below –



This completes the domain join DSC configuration in the Azure Automation account.
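If you would rather watch the compilation job from PowerShell instead of the portal, a small polling sketch using the same AzureRM cmdlets (run in the same session, reusing the $compilationJob variable from above) is –

# Poll until the compilation job completes
while ($null -eq $compilationJob.EndTime -and $null -eq $compilationJob.Exception)
{
    $compilationJob = $compilationJob | Get-AzureRmAutomationDscCompilationJob
    Start-Sleep -Seconds 5
}
# Show the job output streams
$compilationJob | Get-AzureRmAutomationDscCompilationJobOutput -Stream Any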

Password in Plain Text

In both the DSC script and the compilation script above, we used passwords in plain text.

The best way to protect the password is to use certificate based encryption. If you need details around the same then refer - https://blogs.technet.microsoft.com/ashleymcglone/2015/12/18/using-credentials-with-psdscallowplaintextpassword-and-psdscallowdomainuser-in-powershell-dsc-configuration-data/.
However, even if you keep the password in plain text in the script, it is reasonably safe, because the .MOF file generated from compiling the DSC script in Azure Automation is stored encrypted. But the DSC compiler doesn't know that Azure is going to keep the entire .MOF file encrypted, and it throws an error unless we explicitly allow it; that is why we set PSDscAllowPlainTextPassword = $True in the configuration data and use the -Force switch with ConvertTo-SecureString. So, if you are sure that no untrusted admin will access the Automation account from the Azure portal, you can keep the credentials in plain text. The choice is yours!

Add DSC nodes to get them domain joined – finally!

On the Azure portal, under the Automation account, click DSC Nodes -> Add Azure VM -> Virtual Machines. This will list all VMs present in the Azure subscription. Select the VMs of your choice to get them domain joined.
Under the Registration tab, select the node configuration name as "DomainJoinConfiguration.DomainJoined" and provide the other information as shown below.



After a while, the selected node will show its status as Compliant and the VM will show as domain joined.
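You can also check the node status from PowerShell with the same AzureRM module; a quick sketch (the resource group and Automation account names are placeholders) –

# List DSC nodes registered with the Automation account and their compliance status
Get-AzureRmAutomationDscNode -ResourceGroupName 'YourResourceGroupName' -AutomationAccountName 'YourAutomationAccountName' |
    Select-Object Name, Status, NodeConfigurationName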

Bonus Tip - To join VM to specific OU

This is simple. In the main DSC script you need to provide your OU information if you want all VMs to be joined to a specific OU. Example below -
xDSCDomainjoin JoinDomain
{
    Domain     = $Domain
    Credential = $Credential # Credential to join to domain
    JoinOU     = "CN=Computers,DC=someplace,DC=kunal,DC=gov,DC=au"
}
Sometimes you may face errors related to domain credentials in the log generated by the DSC job. In that case, instead of specifying user@domain, you can specify the domain\user format for the credentials in the DSC script.
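For example, the credential line in the DSC script would then look like below (the CONTOSO domain and user name are hypothetical placeholders) –

# domain\user format instead of user@domain (hypothetical values)
$mydomaincreds = New-Object System.Management.Automation.PSCredential ("CONTOSO\YourDomainUser", $secdomainpasswd)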


Conclusion

I hope this post has helped to save you from the very repetitive task of domain joining machines.
Please provide your valuable comments. Good news is it's free!!

Keep Automating!!