Sender Policy Framework (SPF) with SendGrid and O365

TL;DR: add a TXT record with v=spf1

1. Add a TXT record to your domain's DNS.

2. In the TXT record content, add the following, where <domain> is a domain that is allowed to send email on your behalf:

v=spf1 include:<domain> -all

Example for Office 365:

v=spf1 include:spf.protection.outlook.com -all

3. For multiple senders, add a space plus another include:<domain>:

v=spf1 include:<domain> include:<domain> -all

Example for Office 365 with SendGrid (all):

v=spf1 include:spf.protection.outlook.com include:sendgrid.net -all

Note that including SendGrid's entire domain may not be good practice; follow the steps below to have only your own SendGrid account listed in your SPF record.
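If you're juggling several senders, the record is just a space-separated list of mechanisms. Here's a small Python sketch to assemble one (the two include hosts shown are the commonly documented Office 365 and SendGrid values; verify them against your own providers' docs):

```python
def build_spf(includes, policy="-all"):
    """Assemble a v=spf1 TXT record value from a list of include domains."""
    parts = ["v=spf1"] + [f"include:{d}" for d in includes] + [policy]
    return " ".join(parts)

# Office 365 + SendGrid together, with a hard fail for everything else:
record = build_spf(["spf.protection.outlook.com", "sendgrid.net"])
print(record)  # v=spf1 include:spf.protection.outlook.com include:sendgrid.net -all
```

The `-all` at the end tells receivers to hard-fail anything not covered by the includes.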

Long Version:

So, yes, email spam. I have a new project that will require SendGrid, and as I remember, one of the challenges of using a third-party mailing service is that your email may end up in spam, semi-defeating the purpose of sending it at all. This is because email on the internet can be forged in a number of ways (RFC 7208), so receivers need a way to verify that a sender is authorized.

So, according to SendGrid, you need to do two things (#1 and #2). #3 here is for working with another service that uses your domain, and #4 is for testing.

1. Domain verification, which you can do in the SendGrid dashboard (login required), where you add a set of CNAME records in your domain's DNS that point back to SendGrid.

Take note of your record; you will need it in step #2.



2. Add SPF to your domain.

On your domain registrar, add a new TXT record with the value:

v=spf1 include:<domain> -all

The <domain> here is the mail sender.

According to the SendGrid documentation, you need to add your account-specific SendGrid record as an include, for example:

v=spf1 include:<your-sendgrid-domain> -all

You can find this value during your domain validation.

3. If using with Office 365.

This domain is linked to an Office 365 E3 account, which means it has the Exchange Online service. In order to use both O365 and SendGrid together, both domains must be appended as includes in your SPF TXT record in this format:

v=spf1 include:<domain> include:<domain> -all

For example:

v=spf1 include:spf.protection.outlook.com include:<your-sendgrid-domain> -all

4. Testing. There are a number of domain tools out there. I use ol' reliable MX Toolbox:


On the dropdown, choose SPF Record Lookup. Enter your domain and check if your SPF record is already there. This is an example result of a configured SPF with Outlook and SendGrid.


So hey, stop spamming and use SPF. Try it out.


PS: In the mail internet headers (Outlook > File > Properties), you should see the following:


The green highlight is the domain used in SendGrid.


Default Azure Pipeline yaml as task

A quick note on Azure Pipelines. If you create a pipeline through Azure DevOps for a .NET Core app, the generated YAML file will use script steps rather than tasks.


This is also the style used in the Azure Pipelines YAML Templates GitHub repo.


It works, yes, but I would rather use tasks. If you are interested in using them as well, head on here:
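For comparison, here is roughly what the generated script step looks like next to a task-based equivalent (the exact step names and arguments are illustrative; check the Azure Pipelines task reference for the full DotNetCoreCLI@2 schema):

```yaml
# Generated default: a raw script step
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

# Task-based equivalent
- task: DotNetCoreCLI@2
  displayName: 'dotnet build $(buildConfiguration)'
  inputs:
    command: 'build'
    arguments: '--configuration $(buildConfiguration)'
```

The task form gives you structured inputs and better log grouping, at the cost of a few more lines.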


Azure IoT Hub with ESP8266 and automated plant watering with soil moisture sensor

They said that taking care of plants relieves stress [1]. But no one told me that I cannot automate this.


So I created this.


It's an IoT-based automated plant watering system with a soil moisture sensor and a monitoring system. This project uses Azure IoT Hub, Azure Stream Analytics, and Power BI for the dashboard. Data is sent by a NodeMCU / ESP8266 Arduino board. The data comes from a soil moisture sensor, which controls a submersible pump through a relay module. The relay turns on the pump depending on the reported soil moisture and waters the plant from a water source. Reported soil moisture is also shown on a 20×4 LCD and sent to Azure IoT Hub over the internet.

Let's get started!

For software, you will need an Azure subscription and a Power BI subscription. Hardware components are discussed in the hardware section; you may want to jump there to see if you have the right parts for this project.

We will first configure the cloud-based resources needed for this project. For now, we need just Azure IoT Hub and Stream Analytics.

First is to create a Resource Group

Azure IoT Hub

Then, inside the resource group, create a new IoT Hub. IoT Hub controls and receives messages from our IoT devices. Start by clicking the Add icon.


Search for IoT Hub. Then click create.



Choose the newly created Resource Group and remember the IoT Hub name you specified here. You will be needing it later.


IoT Hub has a Free tier. Let's use that for now. Note that in a production setup, you may want to review the scale tier and messages per day.


Once created, it would look like this:


Now let's create a device in the IoT Hub. On the Explorers tab, click IoT Devices, then click the + New icon.


On the Create a device page, make sure Auto-generate keys is checked. Specify a unique device name and then click Save.


After adding a device, open Azure Cloud Shell located at the top right corner of the page. We will use PowerShell.


Once open, add the Azure IoT CLI extension by executing:

“az extension add --name azure-cli-iot-ext”


Then, using the Azure IoT CLI extension, let's create a SAS token. A Shared Access Signature (SAS) token is used by a physical device to authenticate to Azure IoT Hub. IoT Hub then validates the token and authorizes the device to send data.

To generate a SAS token use:

“az iot hub generate-sas-token --device-id <devicename> --hub-name <iot-hub-name>”


Replace <devicename> with the device name you specified during device creation, and <iot-hub-name> with the name of your IoT Hub. A SAS token looks like this:


It is important that you save the resulting SAS token.
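If you're curious what the CLI is doing under the hood, the token is an HMAC-SHA256 signature over the URL-encoded resource URI plus an expiry timestamp, keyed with the base64-decoded device key. Here's a Python sketch (hub name, device name, and key below are all dummies):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, device_key_b64, ttl_seconds=3600):
    """Build an Azure IoT Hub SAS token: HMAC-SHA256 over the
    URL-encoded resource URI plus an expiry timestamp."""
    expiry = int(time.time()) + ttl_seconds
    uri = urllib.parse.quote(resource_uri, safe="")
    to_sign = f"{uri}\n{expiry}".encode("utf-8")
    key = base64.b64decode(device_key_b64)
    sig = base64.b64encode(hmac.new(key, to_sign, hashlib.sha256).digest()).decode()
    sig = urllib.parse.quote(sig, safe="")
    return f"SharedAccessSignature sr={uri}&sig={sig}&se={expiry}"

# Hypothetical hub/device names and a dummy (not real) device key:
token = generate_sas_token("my-iot-hub.azure-devices.net/devices/device5",
                           base64.b64encode(b"dummy-device-key").decode())
print(token[:25])  # SharedAccessSignature sr=
```

In practice the CLI command above is the easier route; this is just to demystify the token format.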

Let's test the SAS token and the IoT Hub. You may use any mock or API testing software; for now, let's use Postman.


Perform an HTTP POST to https://<iot-hub-name>.azure-devices.net/devices/<device-name>/messages/events?api-version=2018-06-30, replacing <iot-hub-name> with your IoT Hub name and <device-name> with your newly created device name.


In the headers, add an Authorization key, and for its value use the Shared Access Signature that was returned during SAS token creation in the Azure CLI.

For the body, create a simple JSON payload. In mine I have "smc" (soil moisture content) with a value of 80, and "idn" (device identification), which I specified as "device5" again. Don't forget to set the body type to "raw" / "JSON", then hit Send. You should get a 2XX response from Azure IoT Hub.
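The same request Postman performs can be assembled in a few lines of Python. This sketch only builds the URL, headers, and body (the hub name, token, and api-version are placeholders; actually sending requires your real hub and token):

```python
import json

API_VERSION = "2018-06-30"  # version current when this was written; check the REST docs

def build_d2c_request(hub_name, device_id, sas_token, payload):
    """Assemble the device-to-cloud POST that Postman performs."""
    url = (f"https://{hub_name}.azure-devices.net/devices/{device_id}"
           f"/messages/events?api-version={API_VERSION}")
    headers = {"Authorization": sas_token, "Content-Type": "application/json"}
    return url, headers, json.dumps(payload)

url, headers, body = build_d2c_request("my-iot-hub", "device5",
                                       "SharedAccessSignature sr=...",
                                       {"smc": 80, "idn": "device5"})
# To actually send it (requires the real hub and token):
#   import requests; requests.post(url, headers=headers, data=body)
```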


Stream Analytics Job

Our next step is to create a Stream Analytics job. Stream Analytics ingests data and streams it in real time to dashboards or to data-at-rest storage. For our project, as highlighted in this figure, data will come from IoT devices through the IoT Hub. Stream Analytics then delivers our data to a Power BI dashboard in real time.


As with the IoT Hub, search for and create a Stream Analytics job.


Choose a name for your new Stream Analytics job, make sure the location is as close to you as possible, and click Create.


Once created, we need to configure an input to be able to ingest data. Under Job topology, click Inputs, then Add stream input, and choose IoT Hub.


Add an alias, select the IoT Hub from your Azure subscription, and click Save.


Now let's deliver data from IoT Hub to Power BI. Still under Job topology, click Outputs and select Power BI.


Click Authorize and sign in to your account. A few windows will pop up; follow the wizard until you are signed in.


This part is important; it was hard for me at first to figure out how this works. For now, you must choose the "user token" authentication mode, so that the group workspace dropdown includes "My workspace". Add an output alias, a dataset (database) name, and a table name. Remember your values here and then click Save.


Now let's tie them together. Under Job topology, click Query. Use a query that selects everything from your input into your output, like so, then save the query.
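The pass-through query itself is short. With the input and output aliases configured above (the alias names here are placeholders; use your own), it looks like:

```sql
SELECT *
INTO [powerbi-output]
FROM [iothub-input]
```

Every JSON field in the device message becomes a column in the Power BI table, plus a few metadata fields Stream Analytics adds, such as the event enqueued time.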


Check your query in the overview; you may start the Stream Analytics job at this point.


Important: using Postman, send a couple of requests to see whether data is flowing from Azure IoT Hub to Stream Analytics. Click Test query to check whether the JSON payloads arrive.

Power BI

Log in to your Power BI account with the same account used during the authorization process in Azure Stream Analytics. Expand My workspace and you should see your dataset under Datasets. Remember the dataset (database) name and table name used in Stream Analytics? They should be the same here.


Click your dataset, and the fields of the table will appear in the right-hand pane.


You can now design your report. Add the event enqueued time (EventEnqueuedUtcTime) and the sum of smc in the field selection, and select an appropriate visualization, such as the stacked area chart here.


Save your report.


Once saved, you can create a new dashboard: pin this report to a new dashboard and then click Pin live.



You can also add more tiles to the dashboard by clicking Edit and then Add tile.


Select Custom Streaming Data and then click Next.


Your dataset should be available here. Select it and click Next.


Choose your visualization type; for this example, let's choose a gauge.


In the tile details, use Soil Moisture Content and then click Apply.


Power BI is also available as an iOS app in the App Store.

Here’s my running dashboard for device5. I plan to use more devices, sensors, and pumps, but that’s for another blog post. Until then, happy coding!


To test the real-time integration, try posting data using Postman and sending it through to Stream Analytics. You should be able to see the data in the dashboard.



We have configured the IoT back end and dashboards for our project, as shown in the red dashed box in this figure.


Our next step is to set up the Arduino and hardware part of our project.

Arduino Hardware

So for our bill of materials:

  1. ESP8266 / NodeMCU Arduino
  2. Node MCU Base Board
  3. LCD Display 20×4 I2C
  4. HW-080 Soil Moisture Sensor
  5. Jumper Wires Pack (M-F / F-F)
  6. Relay Module
  7. 5v Power Adapter
  8. Used USB Power Adapter and Cord
  9. Submersible Pump

I got most of my Arduino parts from the following stores here in the Philippines.

Let's get started assembling our hardware:


Follow this pin-out. Note that the soil moisture sensor uses an analog pin.


Instead of using a breadboard, I used a NodeMCU base board. This way I can power it from a 5V power source, and once completed, the project is already deployable, since the base board is sturdier than a breadboard.


Connect the submersible pump to your relay, using the normally closed terminal. The relay connects or cuts off the power going to the submersible pump. Here’s my wiring:


At this point, the LCD can also be connected; the pin configuration is in the diagram. I also attached the components with some hot glue to make the build sturdy and easy to work with.
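The pump decision itself is simple threshold logic. Here's a host-side Python sketch of the idea before it goes into the firmware (the thresholds are made up; real values depend on your sensor's calibration — and a little hysteresis keeps the relay from chattering right at the threshold):

```python
DRY_THRESHOLD = 40   # % moisture below which we start watering (illustrative)
WET_THRESHOLD = 60   # % moisture at which we stop (illustrative)

def next_pump_state(moisture_pct, pump_on):
    """Hysteresis: turn on when dry, stay on until comfortably wet."""
    if moisture_pct < DRY_THRESHOLD:
        return True
    if moisture_pct > WET_THRESHOLD:
        return False
    return pump_on  # in the dead band, keep the current state

# Simulate a drying-out-then-watered cycle:
state = False
for reading in [70, 55, 38, 45, 58, 65]:
    state = next_pump_state(reading, state)
    print(reading, "->", "PUMP ON" if state else "pump off")
```

The same logic translates directly into the Arduino loop, with the relay pin driven by the returned state.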


The casing is just a soap dispenser, bought from SM Makati (they got it all for youuuu…), bathroom section, near the pillows, 4th floor. I drilled holes in the soap dispenser to fit the wiring and the LCD screen. Since the NodeMCU needs power, a hole was also drilled on the side.

Here are some measurements that I used.


Arduino Software / Firmware

I have created a simple repository that contains the relevant source code for running this. Please check out the GitHub link or go directly to the storage.

Check out the code inside the loop.


For Wi-Fi management I used ESP8266-SimpleWifiManager, which is available on GitHub.

Integrating Arduino with Azure IoT Hub

For sending data to the cloud, I used HTTP.

Here’s the relevant code that sends data to Azure IoT Hub.


First is the SSL fingerprint. As of writing, the HTTP client I am using needs the server's SSL fingerprint. There are a number of ways to obtain it, like from the following sites. Use it to begin your HTTP client instance.

The server address is the one you were given in creating the IoT Hub and adding a device. In my example, I already replaced <device-name> with device5.


Also add the SAS token as the value of the Authorization header, as you did in Postman.

Perform an HTTP POST with a formatted payload containing your soil moisture and other information; in mine, I added the last pump interval. I measure the amount of time the pump has been running. Why, you say? Just wanted to make sure my plants are not drowning. Smile

The Completed Project

Hooray! Let's run this: closing the lid and powering on the ESP8266 and the submersible pump. Soil moisture is also shown in the Power BI dashboard on mobile.


When soil moisture falls below the threshold, the ESP8266 turns on the pump and waters the plant. It also shows on the LCD, like here:



So that’s it! That’s my pet project for the weekend. I hope this was informative and you had as much fun as I did.

In conclusion: watering plants can relieve stress. Automate it, add some IoT and Azure, and now the fun begins!



Ah, before you leave and try this out, make sure you have enough credits to support your Stream Analytics job. It will cost you a bit. Here is my bill after a while:


Second, I am not an electrical engineer. My background is IT and software. My last experience in electronics was back when I took Industrial Electro-Mechanics in Don Bosco Canlubang. Other than that, I usually break electronics at home. Soooo… please check your wiring and double-check sources (as with any other source on the internet: fact-check, people!). I provide no warranties whatsoever; use at your own risk.

It's a prototype, a hobby build. Do not use it in production.

Finally, this is an opinionated approach. These thoughts are my own, and I do not represent any institution or my employer.



Managed SSL in Azure

So I have a website hosted in an Azure Web App. The certificate expired, and I wanted to try out other SSL providers. Good thing Azure already has a managed certificate service, currently in preview. I wanted to try it out and share my experience here.


1. Create an App Service Managed Certificate under TLS/SSL settings > Private Key Certificates of your Web App.

2. Bind your Managed Certificate to the Web App.

3. Optional but recommended, redirect to HTTPS only.

Overall, it's a more pleasant experience, and I consider this an upgrade from my four-part HTTP-to-SSL blog series, where I got an SSL certificate from DigiCert, installed it, and did some workarounds for auto-redirection. That post is here if you are still interested:

So, let's get started!

Open the Azure Portal, go to your Web App, and in the settings pane search for TLS or SSL. On the TLS/SSL settings page, click Private Key Certificates and then Create App Service Managed Certificate.


On the Create App Service Managed Certificate pane, select from the dropdown the App Service host name you want SSL for. After the validation, you will be able to press the Create button.


Wait for the certificate to be created..


Once created, it will be listed in the Private Key Certificates table below.


You can click the certificate to check its details, including the expiry date. Note to self: don't forget to renew this time.


Now, to actually use the certificate, click the Bindings tab of the TLS/SSL settings and click Add TLS/SSL binding. Choose the domain you are assigning the private certificate to, and the certificate.


After adding it, it should appear as a new binding with the hostname in the table below.


Optional, but I do recommend it: enable HTTPS Only. It will always redirect your site to HTTPS, since you already have your shiny new SSL certificate on your site.


I am using Chrome to test, and it took a few hard refreshes to see the new certificate. I also checked the SSL certificate, and it's good. See that lock?


I saw that it's GeoTrust –

More info can be found here:

And thanks to Miguel and Azure Pilipinas FB page for the link!

On HTTP/2 today

It's 2018, and HTTP/2 is a long-overdue performance upgrade over the 1999 HTTP 1.0 and 1.1 specs. Its promise is to reduce perceived load time from the user's perspective and to make more efficient use of server and network resources.

So let's try it out and see if we can get more optimized results on one of my development environments.

The test uses a single website with caching disabled in Chrome, so that we have a consistent measuring tool for performance. We perform two tests: before and after we turn on HTTP/2 on our web server.

My development site using HTTP 1.1


Same development site upgraded to HTTP/2:


Response time results: HTTP/2 performs better at returning individual resources from the server.

Resource          Type   HTTP/1.1   HTTP/2
Initial Document  HTML   346 ms     57 ms
SVG Image         XML    165 ms     47 ms
Site.min.js       JS     65 ms      43 ms
Site.min.css      CSS    57 ms      48 ms
Select2.min.js    JS     85 ms      84 ms
Select2.min.css   CSS    121 ms     49 ms

Size results: HTTP/2 compresses request and response headers (HPACK), so the total transferred size of files served over HTTP/2 comes out lower.

Resource          Type   HTTP/1.1   HTTP/2
Initial Document  HTML   2.6 KB     2.4 KB
SVG Image         XML    1.4 KB     1.2 KB
Site.min.js       JS     603 B      203 B
Site.min.css      CSS    1.4 KB     1.1 KB
Select2.min.js    JS     25.1 KB    24.8 KB
Select2.min.css   CSS    3.1 KB     2.8 KB
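To put numbers on "performs better", here's a quick Python pass over the response-time measurements above:

```python
# (resource, HTTP/1.1 ms, HTTP/2 ms) from the response-time table above
timings = [
    ("Initial Document", 346, 57),
    ("SVG Image",        165, 47),
    ("Site.min.js",       65, 43),
    ("Site.min.css",      57, 48),
    ("Select2.min.js",    85, 84),
    ("Select2.min.css",  121, 49),
]

for name, h1, h2 in timings:
    reduction = (h1 - h2) / h1 * 100
    print(f"{name}: {reduction:.0f}% faster")

total_h1 = sum(t[1] for t in timings)  # 839 ms
total_h2 = sum(t[2] for t in timings)  # 328 ms
print(f"Overall: {(total_h1 - total_h2) / total_h1 * 100:.0f}% less time")
```

Summed across these six resources, the HTTP/2 run spends roughly 61% less time, with the biggest wins on the initial document and the SVG.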

Domain sharding results: need more verification (I'm sleepy already).

According to the previous spec (RFC 2616, section 8.1.4, Practical Considerations), a client should not maintain more than 2 connections with any server or proxy. Thus, we used to need a workaround where resources are located on a different server (or, in our case, a different DNS name, or a Content Delivery Network). Now, from our results:


I am thinking about not using a CDN anymore if it still only supports SPDY or HTTP/1.1; however, looking at the cached results:


Enabling HTTP/2

How did I enable HTTP/2? This site is hosted in Azure as an App Service. To enable it, search for the HTTP version setting in the Application settings, switch it to 2.0, click Save, and refresh.


For on-premises servers, HTTP/2 is also available on Windows Server 2016 / IIS 10 and Windows 10.

Azure Mobile App

I'm currently OOF with only an iPhone. So, while waiting here, I was browsing apps and checked, out of curiosity, whether there is a mobile app for Azure.

There is.

Downloaded this and signed in using my live account.

This will open a new web view pop-up.

After logging in, you will be redirected to your home page. You can also filter your services.

The hamburger menu opens your profile and displays your directories with subscriptions.

And here are the details of my app. Happy to see it's running without any errors 🙂

Here's a sample of the App Insights data for one of my web apps.

This app is so cool it goes on my first screen.

So that's it for now! Coolness!

Moving to SSL / HTTPS

Recently I have walked the talk and have moved my personal site to HTTPS.

Although I have already moved, redirected, and configured many web front ends to use SSL, I hadn't gotten around to implementing it on my own websites. In comparison, my site is not a transactional site and doesn't do any registration; I only use it as my portfolio site, as well as a live test environment where I can experiment, learn, validate, and do pretty much everything without impact to anyone but me.

There are a lot of articles here, here, here, and there regarding the pros and cons of having a site on HTTPS. Basically, from what I am reading, it adds some cost and some load, but it has to be done.

And thanks to modern tech, the move is fairly easy:

  1. Choose your CA – validation and order.
  2. Create a CSR – using a tool, or MMC / InetMgr.
  3. Install the PFX on your website – Azure Websites Basic tier and above.
  4. Auto-redirection – using IIS URL Rewrite rules (Azure), with a demo of TFS Online 🙂

So here’s my contribution to the secure modern web! Happy SSL!


This exercise got me thinking: we are really in the age of the cloud service already. From requesting certificates to installation, scaling my application, and even a source-code rebuild-test-deploy scenario, I haven't touched a single MMC or any server directly. The old concepts are still there (web deploy complaining about a file being in use, or using IIS Manager to request a CSR and fulfill the certificate request), but in a modern way. The difference is that I used to do MSTSC; now, I am talking to a web browser. This could have taken days or even weeks, not to mention misconfigurations from my end, but now I am up and running "as I wish". Hmm. 🙂

Moving to SSL / HTTPS-PART 4

“We deprecated the hosted XAML build controller on July 1st 2017. We recommend that you migrate to our new build system. However if you still need to run XAML builds during the migration then you must set up a private XAML build controller now”.

Yes, yes. I forgot to upgrade. Let's move on.

So, in order to publish, we just need to log in to our account and go to the project we need to publish.

There is a tab called Build and Release, and there should be an Azure web app template.


Once applied, you first need to choose which solution to build and deploy, kind of like WEBDEPLOY before.


Then we need to link our Azure account and choose which App Service to deploy to.


The link happens when you authorize Visual Studio by logging in to your Azure account. Note that this is a pop-up.


Then click Refresh if you don't see your App Service in the drop-down.


Then voila, you can now save, or save and immediately queue a deployment.


This should queue up and warm up an available agent, again like WEBDEPLOY before.


Once the agent fires up the deployment, the scripting engine console is shown and you can watch the progress.


Aha! You are still using WEBDEPLOY! Long live web deploy!


NOOOOOOO! Okay, New Relic is giving me a bump. Like the old WEBDEPLOY: a file is in use, therefore you can't overwrite it, and your deployment task will fail.


As I remember, it's just as easy as:


Or we could just easily do a slot deployment and switch slots after. Then I remembered that I am on the B1 tier in Azure. There is no slot deployment for that! Great.

Then I remembered: this is my PERSONAL site; no one visits it. Let's just stop the site.

So let's do this: let's insert two tasks into the build definition, one to stop and one to start, effectively sandwiching the deployment. So add an Azure App Service Manage task.


For the first one, stop the App Service. You know which subscription and App Service to stop.


After the Azure App Service deployment task, we should start the service.


Let's try it out: save the build definition and queue a build!


Aha! Stop worked!


Publishing.. Yes!


The build says it's okay and was deployed successfully.


This got me thinking: we are really in the cloud already. From requesting certificates to installation, scaling my application, and even a source-code rebuild-test-deploy, I haven't touched a single MMC or any server directly.

Moving to SSL / HTTPS-PART 3

Azure Websites Basic Pricing Tier (SSL Support)

So you now have an SSL certificate? Let's install it on your Azure Website. I distinctly remember that in order to have a custom domain (without the default *.azurewebsites.net name), you have to be on at least the D1 Shared instance, which is where I am right now.

So from D1 Shared, I upgraded to B1.



Once upgraded, you can now go to the SSL settings. You can find it through the Web App settings search; in there, click Upload Certificate.


Now, remember the PFX file we created in the earlier part? Use that, along with the password we set when we exported the PFX.




Still within SSL settings, we now have to bind the uploaded SSL certificate to the domain we want to secure. Click SSL Bindings.


Choose SNI SSL after selecting the hostname and certificate name combination, then click Add Binding.


So that’s it: in just three easy steps we have a working SSL certificate bound to our site.


Now to check, let's open the site using Chrome and IE.


Valid certificate! Sweet!


But our old HTTP-only site is still active, so we need to automatically redirect visitors from HTTP to HTTPS. URL Rewrite should do this. Let's edit web.config!

My TFS Online is linked to my Azure Websites. I already had a redirect before, so this should be a fairly easy web.config change, then build and deploy.
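For reference, the redirect is one URL Rewrite rule in web.config. The standard HTTP-to-HTTPS rule (rule name is arbitrary) looks like this:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect to HTTPS" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <!-- Only fire when the request did not arrive over HTTPS -->
          <add input="{HTTPS}" pattern="off" ignoreCase="true" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The `{R:1}` back-reference preserves the original path, so deep links redirect to their HTTPS equivalents instead of the home page.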


Oh no. I got a message: “We deprecated the hosted XAML build controller on July 1st 2017. We recommend that you migrate to our new build system. However if you still need to run XAML builds during the migration then you must set up a private XAML build controller now”.


I can't believe I never got around to updating my own build! Okay, no time to waste; let's just create a new build definition. Stay tuned for part 4.

Moving to SSL / HTTPS–PART 2

In this Part 2, we are going to get our CER and PFX files to use with Azure.

Create Certificate Signing Request

There is a tool available through the DigiCert website, or you can do it manually in IIS. Since my target is to install this in Azure, I chose to use the tool they provided.


Let's use a Windows PC. I will not have IIS Manager access to my Azure Website, so we need to generate the certificate locally and then install it.


Download the tool, extract it, and run it.



Click the SSL Certificate tab and click Create CSR.


This reminds me of the IIS Manager Create Certificate Request action, but it should be straightforward. Click SSL, make sure your info is correct, and then click Generate.


Then copy the result to Notepad; the clipboard alone can also be enough.


Log back in to your DigiCert account and click the status of your order. There should be a Pending CSR there.


This opens up a pane where you can paste the CSR.


I chose IIS 10 and then clicked Continue.


Then voila, CSR completed. This will trigger an email with your .CER file attached.



Unzip this to get the .CER and some instructions.


Go back to the DigiCert certificate tool and import the CER. You need to get the PFX out of this CER.


Once you click Next, just enter your friendly name and then finish. It should show in the utility.


Like this:


Now let's export the PFX: just highlight the certificate and then click Export.


Export the private key, choose PFX, and include all certificates in the certification path if possible. Click Next.


Yes, like in MMC, you need to provide a password, since you are exporting the private key as well.


Then save the PFX file to a location where you can pick it up to install in Azure.


You can now close this tool. Thanks DigiCert!