Mail: Sender Policy Framework (SPF) with SendGrid and O365

TL;DR: add a TXT record with v=spf1

1. Add a TXT record in your domain's DNS.

2. In the TXT content, add the following, where <domain> is a domain that is allowed to send email on your behalf:

v=spf1 include:<domain> -all

Example for Office 365:

v=spf1 include:spf.protection.outlook.com -all

3. For multiple senders, separate each include:<domain> with a space:

v=spf1 include:<domain> include:<domain> -all

Example for Office 365 with SendGrid (all):

v=spf1 include:spf.protection.outlook.com include:sendgrid.net -all

Note that including an entire provider's domain may not be good practice; follow the steps below to have only your own SendGrid account listed in your SPF record.
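Composing the record is just string concatenation; here is a minimal Python sketch (the include hosts shown are the commonly documented ones for Office 365 and SendGrid; check your providers' docs for current values):

```python
def build_spf(include_domains, qualifier="-all"):
    """Compose an SPF TXT record value from a list of allowed sender domains."""
    parts = ["v=spf1"] + [f"include:{d}" for d in include_domains] + [qualifier]
    return " ".join(parts)

# Office 365 + SendGrid together:
print(build_spf(["spf.protection.outlook.com", "sendgrid.net"]))
# v=spf1 include:spf.protection.outlook.com include:sendgrid.net -all
```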

Long Version:

So, yes, email spam. I have a new project that will require SendGrid, and as I remember, one of the challenges in using a third-party mailing service is that your email may end up in spam, semi-defeating the purpose of your email sender. Email on the internet can be forged in a number of ways (RFC 7208).

So according to SendGrid, you need to do two things (#1 and #2); #3 here is for working with another service that uses your domain, and #4 is for testing.

1. Domain verification, which you can do at (please log in), where you add a bunch of CNAMEs in your domain DNS that point back to SendGrid.

Take note of your record; you will need it in step #2.



2. Add SPF on your domain.

On your domain registrar, add a new TXT record with the value:

v=spf1 include:<domain> -all

The <domain> here is the mail sender.

According to the SendGrid documentation, you need to add the include value for your account, for example:

v=spf1 include:<your-sendgrid-record> -all

You can find this value during your domain validation.

3. If using with Office 365.

So this domain is linked to an Office 365 E3 account, which means it has an Exchange Online service. In order to use both O365 and SendGrid together, both domains must be appended as includes in your SPF TXT record, in this format:

v=spf1 include:<domain> include:<domain> -all

For example:

v=spf1 include:spf.protection.outlook.com include:sendgrid.net -all

4. Testing. There are a number of DNS tools out there. I use ol' reliable MX Toolbox:


On the dropdown, choose SPF Record Lookup, enter your domain, and check if your SPF record is already there. This is an example result of a configured SPF with Outlook and SendGrid.


So hey, stop spamming and use SPF. Try it out.


PS: In the mail internet headers (in Outlook: File > Properties), you should see the following:


Green highlight is the domain used in SendGrid.


Plus Addressing in Outlook and O365

Plus addressing is adding a "filter," suffix, or tag to your existing email address so that you can filter incoming email, making your own email address dynamic.

Let's say you have john@<domain> as your primary email; you can use john+<suffix>@<domain>, for example to register to Netflix. Emails sent to that address still deliver to your mailbox.

For instance, say you want to sign up for a subscription. Do not just give out your whole email address; try adding "+subscription", where "subscription" is the name of the service you are registering for.


Once a mail arrives in your mailbox, it should look something like this. Notice that the "To:" field contains the suffix or keyword that you used.


What are the possible use cases and possibilities?

  • For one, it helps you track down which company, spammer, or subscription an email came from. Even if they "sell" your email and the "From" changes, the suffix/tag will still be there.
  • Use mail filters: automatic spam handling, categorizing, or moving to a folder.
  • Have multiple email addresses.
  • An alternative to disposable emails. A long stretch, but possible, right?
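For filters like these, extracting the tag is a simple split on "+"; a minimal Python sketch (the address is a hypothetical example):

```python
def split_plus_address(address):
    """Split user+tag@domain into (user, tag, domain); tag is "" when absent."""
    local, _, domain = address.partition("@")
    user, _, tag = local.partition("+")
    return user, tag, domain

print(split_plus_address("john+netflix@example.com"))
# ('john', 'netflix', 'example.com')
```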

Geek section!

This is actually an implementation of RFC 3696 (2004), better known as "Application Techniques for Checking and Transformation of Names".

Look up Section 3 for email addresses and Section 4.3 for mailto URLs.


So for the longest time, this has been available in other public email services like Gmail, where it's called "task-specific email addresses."

Or something like this from 2008:


Exchange and Office 365

Office 365 did not support this, at least until September. I had been waiting for this feature since it was announced; I saw on UserVoice that it would be available in Q4 of 2020, and it seems to have arrived early or on time. So I am revisiting this and applying it to my own O365 E3 tenant.

To implement on O365:

On my freshly updated PC (with internet connectivity):

1. Run PowerShell as Administrator

2. Set-ExecutionPolicy RemoteSigned

3. Install-Module -Name ExchangeOnlineManagement

4. Import-Module ExchangeOnlineManagement

5. Connect-ExchangeOnline -UserPrincipalName <user>@<o365domain>.com -ShowProgress $true

6. Set-OrganizationConfig -AllowPlusAddressInRecipients $true


Let's try it out on my Office 365 email. Suppose I wanted to use my email to re-subscribe to Coursera:

I use


It arrives in my Office 365 mailbox as:


PS: Apologies for the redaction. Some re-bloggers and bots are mining my blog pictures, and screenshots may end up on other sites showing my naked email address. But I do hope the point came across.

PPS: I got a few spam mails, and that is why I revisited this in the first place. Hoping that I can at least help my Exchange mailbox filter out nasty emails. Again, apologies for the redaction.

Default Azure Pipeline yaml as task

A quick note on Azure Pipelines: if you create a pipeline through Azure DevOps for a .NET Core app, the generated YAML file will use script steps rather than tasks.


This is also used in the Azure Pipelines YAML templates GitHub repo.


It works, yes, but I would rather use tasks. If you are interested in using them as well, head on here:
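For reference, the task-based shape looks roughly like this (a sketch using the DotNetCoreCLI task; the displayName and projects glob are assumptions, not what the wizard generates):

```yaml
# Instead of:
#   - script: dotnet build --configuration Release
# use the DotNetCoreCLI task:
steps:
- task: DotNetCoreCLI@2
  displayName: 'dotnet build'
  inputs:
    command: 'build'
    projects: '**/*.csproj'
    arguments: '--configuration Release'
```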


Azure IoT Hub with ESP8266 and automated plant watering with soil moisture sensor

They say that taking care of plants relieves stress [1]. But no one told me that I cannot automate it.


So I created this.


It's an IoT-based automated plant watering system with a soil moisture sensor and a monitoring system. This project uses Azure IoT Hub, Azure Stream Analytics, and Power BI for the dashboard. Data is sent by the NodeMCU/ESP8266 Arduino and comes from a soil moisture sensor, which controls a submersible pump through a relay module. The relay turns on the pump depending on the reported soil moisture and waters the plant from a water source. The reported soil moisture is also shown on the 20×4 LCD and is sent to Azure IoT Hub over the internet.

Let's get started!

For software, you will need an Azure subscription and a Power BI subscription. Hardware components are discussed in the hardware section; you may want to jump there to see if you have the right parts for this project.

We will first configure the cloud resources needed for this project. For now, we need just the Azure IoT Hub and Stream Analytics.

First is to create a Resource Group

Azure IoT Hub

Then, inside the Azure resource group, create a new IoT Hub. The IoT Hub controls and receives messages from our IoT devices. Start by clicking the add icon.


Search for IoT Hub. Then click create.



Choose the newly created Resource Group and remember the IoT Hub name you specified here. You will be needing it later.


IoT Hub has a free tier; let's use that for now. Note that in a production setup, you may want to review the scale tier and messages per day.


Once created, it would look like this:


Now let's create a device in the IoT Hub. On the Explorers tab, click IoT Devices, then click the + New icon.


On the Create a device page, make sure Auto-generate keys is checked. Specify a unique device name and then click Save.


After adding a device, open Azure Cloud Shell located at the top right corner of the page. We will use PowerShell.


Once open, add the Azure IoT CLI extension by executing:

"az extension add --name azure-cli-iot-ext"


Then, using the Azure IoT CLI extension, let's create a SAS token. A Shared Access Signature (SAS) token is used by a physical device to authenticate to Azure IoT Hub. IoT Hub then checks the token and authorizes the device to send data.
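If you are curious what the CLI does under the hood, a SAS token is an HMAC-SHA256 signature over the URL-encoded resource URI plus an expiry timestamp, keyed with the base64-decoded device key. A Python sketch of that documented scheme (the hub, device, and key below are made up for illustration):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, b64_device_key, ttl_seconds=3600):
    """Build an IoT Hub SAS token: HMAC-SHA256 over '<url-encoded-uri>\\n<expiry>'."""
    expiry = int(time.time()) + ttl_seconds
    uri = urllib.parse.quote_plus(resource_uri)
    to_sign = f"{uri}\n{expiry}".encode()
    key = base64.b64decode(b64_device_key)
    sig = base64.b64encode(hmac.new(key, to_sign, hashlib.sha256).digest()).decode()
    return f"SharedAccessSignature sr={uri}&sig={urllib.parse.quote_plus(sig)}&se={expiry}"

# hypothetical hub/device/key, for illustration only:
token = generate_sas_token("myhub.azure-devices.net/devices/device5",
                           base64.b64encode(b"not-a-real-key").decode())
print(token)
```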

To generate a SAS token use:

"az iot hub generate-sas-token -d <devicename> -n <iot-hub-name>"


Replace <devicename> with the device name you specified during device creation, and <iot-hub-name> with the name of your IoT Hub. SAS tokens look like this:


It is important that you save the resulting SAS token.

Let's test the SAS token and the IoT Hub. You may use other mock or API testing software; for now, let's use Postman.


Perform an HTTP POST to the device events endpoint (https://<iot-hub-name>.azure-devices.net/devices/<device-name>/messages/events?api-version=2018-06-30), replacing <iot-hub-name> with your IoT Hub name and <device-name> with your newly created device.


In the headers, add an Authorization key, and for its value use the Shared Access Signature that was returned during the SAS token creation in the Azure CLI.

For the body, create a simple JSON payload. In mine I have "smc" (soil moisture content) with a value of 80 and "idn" (device identification), which I specified as "device5" again. Do not forget to make it a "raw" "JSON" request, then hit Send. You should get a 2XX response from Azure IoT Hub.
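The same request can be scripted instead of using Postman; a Python sketch with the standard library (the hub name and SAS token are placeholders; "device5" and the smc/idn fields match the payload above):

```python
import json
import urllib.request

url = ("https://<iot-hub-name>.azure-devices.net"
       "/devices/device5/messages/events?api-version=2018-06-30")
body = json.dumps({"smc": 80, "idn": "device5"}).encode()

req = urllib.request.Request(url, data=body, method="POST")
req.add_header("Authorization", "SharedAccessSignature sr=...")  # paste your SAS token
req.add_header("Content-Type", "application/json")
# urllib.request.urlopen(req)  # uncomment to actually send; expect a 2xx response
```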


Stream Analytics Job

Next is to create a Stream Analytics job. A Stream Analytics job "accepts," or ingests, data and streams it in real time to dashboards or any data-at-rest storage. For our project, as highlighted in this figure, data will come from IoT devices through the IoT Hub; Stream Analytics then delivers our data to a Power BI dashboard in real time.


Same as with the IoT Hub, search for and create a Stream Analytics job.


Choose a name for your new Stream Analytics job, make sure the location is as close to you as possible, and click Create.


Once created, we need to configure an input to be able to ingest data. On Job topology, click Add a stream input and choose IoT Hub.


Add an alias, select the IoT Hub from your Azure subscription, and click Save.


Now let's deliver data from the IoT Hub to Power BI. Still in Job topology, click Outputs and select Power BI.


Click Authorize and sign in to your account. A few windows will pop up; follow the wizard until you are signed in.


This part is important. It was hard for me at first to figure out how this works, but you must choose the "user token" authentication mode; this is so that the group workspace dropdown includes "My workspace." Add an output alias, a dataset (database) name, and a table name. Remember your values here and then click Save.


Now let's tie them together. From Job topology, click Query. Use a query that selects everything from your input into your output, like so, then save the query.
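The pass-through query is just this (the aliases in brackets are whatever input and output aliases you chose above):

```sql
SELECT
    *
INTO
    [powerbi-output]
FROM
    [iothub-input]
```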


Check your query in the overview; you may start the Stream Analytics job at this point.


Important: using Postman, send a couple of requests to see whether data is flowing from Azure IoT Hub to Azure Stream Analytics. Click Test query to check for JSON payloads.

Power BI

Log in to your Power BI account with the same account used during the authorization process in Azure Stream Analytics. Expand My workspace and you should see your dataset under Datasets. Remember the dataset (database) name and table name used in Stream Analytics? They should be the same.


Click your dataset, and the fields of the table will appear in the right-hand pane.


You can now design your report. Add the event enqueued time and the sum of smc in the fields selection, and select an appropriate visualization, such as the stacked area chart here.


Save your report.


Once saved, you may now create a new dashboard: pin this report to a new dashboard and then click Pin live.



You can also add more tiles to the dashboard by clicking Edit and then Add tile.


Select Custom Streaming Data and then click Next.


Your dataset should be available here. Select and click next.


Choose your visualization type; for this example, let's choose a gauge.


On the tile details, use Soil Moisture Content and then click Apply.


Power BI is also available as an iOS app in the App Store.

Here's my running dashboard for device5. I plan to use more devices, sensors, and pumps, but that's for another blog post. Until then, happy coding!


To test the real-time integration, try posting data using Postman and sending it to Stream Analytics. We should be able to see the data in the dashboard.



We have configured the IoT back end and dashboards for our project, as shown in the red dashed box in this figure.


Next is to set up the Arduino and hardware part of our project.

Arduino Hardware

So for our bill of materials:

  1. ESP8266 / NodeMCU Arduino
  2. Node MCU Base Board
  3. LCD Display 20×4 I2C
  4. HW-080 Soil Moisture Sensor
  5. Jumper Wires Pack (M-F / F-F)
  6. Relay Module
  7. 5v Power Adapter
  8. Used USB Power Adapter and Cord
  9. Submersible Pump

I got most of my Arduino parts from the following stores here in the Philippines.

Let's get started on assembling our hardware:


Follow this pin-out. Note that the soil moisture sensor uses an analog pin.


Instead of using a breadboard, I used a NodeMCU base board. This way I can power it from a 5V power source, and once completed, the project is already deployable, since the base board is sturdier than a breadboard.


Connect the submersible pump to your relay, using the normally closed terminal. The relay connects or cuts off the power going to the submersible pump. Here's my wiring:


At this point, the LCD can also be connected; the pin configuration is available in the diagram. I also attached the components with some glue stick to make everything sturdy and easy to work with.


The casing is just a soap dispenser, bought from SM Makati (they got it all for youuuu…), bathroom section near the pillows, 4th floor. I drilled some holes in the soap dispenser to fit the wiring and the LCD screen, and since the NodeMCU needs power, a hole was also drilled on the side.

Here are some measurements that I used.


Arduino Software / Firmware

I have created a simple repository that contains the relevant source code for running this. Please do check out this GitHub link, or go directly to the storage.

Check out the code inside the loop.


For Wi-Fi and IoT I used ESP8266-SimpleWifiManager, which is available on GitHub.

Integrating Arduino with Azure IoT Hub

For sending data to the cloud, I used HTTP.

Here’s the relevant code that sends data to Azure IoT Hub.


First is the SSL fingerprint. As of this writing, the HTTP client that I am using needs the server's SSL fingerprint. There are a number of ways to get it, such as from the following sites. Use it to begin your HTTP client instance.

The server address is the one you were given when creating the IoT Hub and adding a device. In my example, I have already replaced <device-name> with device5.


Also add the SAS token as the value of the Authorization header, as you did in Postman.

Perform an HTTP POST with a formatted payload containing your soil moisture and other information; in mine, I added the last pump interval. I measure the amount of time the pump has been running. Why, you say? Just wanted to make sure my plants are not drowning.

The Completed Project

Hooray! Let's run this: closing the lid and powering on the ESP8266 and the submersible pump. The soil moisture is also shown on the Power BI dashboard on mobile.


When the soil moisture is below the threshold, the ESP8266 turns on the pump and waters the plant. It also shows on the LCD, like here:
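The firmware itself is Arduino C++, but the threshold logic is simple enough to sketch in Python for clarity (the threshold value and the sensor's dry/wet analog extremes are assumptions; tune them for your own sensor and soil):

```python
MOISTURE_THRESHOLD = 40           # assumed percent; tune for your plant and sensor
SENSOR_DRY, SENSOR_WET = 1023, 0  # assumed 10-bit analog extremes for the HW-080

def to_percent(raw):
    """Map the raw analog reading to 0-100% moisture (dry reads high here)."""
    raw = max(min(raw, SENSOR_DRY), SENSOR_WET)
    return round(100 * (SENSOR_DRY - raw) / (SENSOR_DRY - SENSOR_WET))

def pump_should_run(raw):
    """Drive the relay (and pump) only while moisture is below the threshold."""
    return to_percent(raw) < MOISTURE_THRESHOLD
```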



So that's it! That's my pet project for the weekend. Hope this was informative and that you had as much fun as I did.

In conclusion, watering plants can relieve stress. Automate it, put in some IoT and Azure, and the fun begins!



Ah, before you leave and try this out, make sure you have enough credits to support your Stream Analytics job; it will cost you a bit. Here is my bill after a while:


Second, I am not an electrical engineer. My background is IT and software; my last experience in electronics was back when I took Industrial Electro-Mechanics in Don Bosco Canlubang. Other than that, I usually break electronics at home. Soooo... please check your wiring and double-check sources (as with any other source on the internet, fact-check, people!). I provide no warranties whatsoever; use at your own risk.

It's a prototype, a hobby build. Do not use it in production.

Finally, this is an opinionated approach; these are thoughts of my own, and I do not represent any institution or my employer.



Simplified take on Arduino ESP8266 WiFi Management

So hello! If you are doing Arduino and ESP8266, I have just published my first GitHub repo, on WiFi management for IoT devices.

More of a hobby than anything professional, but I try to keep the code clean, with some OOP structure so that the library is reusable by anyone who wants to use it.


If this is something that may interest you, please feel free to go to

There are other approaches to ESP8266 WiFi management; however, most solutions I found switch the WiFi mode of the ESP8266 upon configuration, leaving the ESP8266 hard to manage or reconnect to other WiFi hotspots. This project supports both WiFi modes while serving a captive portal and a management portal.

This project is meant to be very simple and easy to use. I hope to make it more configurable and to serve HTTP requests faster soon (I am a beginner in C++ and Arduino, though).

So, this post is more of an introduction. I want to blog soon about the design decisions that went into creating the library.

Managed SSL in Azure

So I have a website hosted in an Azure Web App. Its certificate expired, and I wanted to try out other SSL providers. Good thing Azure already has a managed certificate service, currently in preview. I wanted to try it out and share my experience here.


1. Create App Service Managed Certificate over TLS/SSL Settings > Private Key Certificates of your Web App.

2. Bind your Managed Certificate to the Web App.

3. Optional but recommended, redirect to HTTPS only.

Overall, it's a much more pleasant experience, and I consider this an upgrade from my four-part HTTP-to-SSL blog, where I attempted to get an SSL certificate from DigiCert, install it, and do some workarounds for auto redirection. That post is here if you are still interested:

So, lets get started!

Open the Azure Portal, go to your Web App, and in the settings pane search for TLS or SSL. On the TLS/SSL settings page, click Private Key Certificates and then Create App Service Managed Certificate.


On the Create App Service Managed Certificate pane, drop down the App Service host name you want SSL for. Mine is my domain, so after the validation, you will be able to press the Create button.


Wait for the certificate to be created..


Once created, it will be available in the Private Key Certificates table below.


You can click the certificate to check the details, which include the expiry date. Note to self: don't forget to renew this time.


Now, to actually use the certificate, click the Bindings tab of the TLS/SSL settings and click Add TLS/SSL Binding. Choose the domain you are assigning the private certificate to, and the certificate itself.


After adding, it should appear as a new binding with the hostname in the table below.


Optional, but I do recommend it: use HTTPS Only. It will always redirect your site to HTTPS, since you already have your shiny new SSL certificate.


I am using Chrome to test, and it took a few hard refreshes to see the new certificate. I also checked the SSL certificate, and it's good. See that lock?


Saw that it's GeoTrust.

More info could be found here:

And thanks to Miguel and Azure Pilipinas FB page for the link!

On HTTP/2 today

It's 2018, and HTTP/2 is a long-overdue performance upgrade from the 1999 HTTP/1.0 and 1.1 specs. Its promise is to reduce perceived load time from the user's perspective and to make more efficient use of server and network resources.

So let's try it out and see if we can get more optimized results in one of my development environments.

The test uses a single website with cache disabled in Chrome so that we have a measuring tool for performance. We perform two tests: before and after we turn on HTTP/2 on our web server.

My development site using HTTP 1.1


Same development site upgraded to HTTP/2:


Response time results: HTTP/2 performs better at returning individual resources from the server.

Resource          Type  HTTP/1.1  HTTP/2
Initial Document  HTML  346 ms    57 ms
SVG Image         XML   165 ms    47 ms
Site.min.js       JS    65 ms     43 ms
Site.min.css      CSS   57 ms     48 ms
Select2.min.js    JS    85 ms     84 ms
Select2.min.css   CSS   121 ms    49 ms

Size results: HTTP/2 compresses headers (HPACK), and this shows that files served through HTTP/2 are smaller in transferred size.

Resource          Type  HTTP/1.1  HTTP/2
Initial Document  HTML  2.6 KB    2.4 KB
SVG Image         XML   1.4 KB    1.2 KB
Site.min.js       JS    603 B     203 B
Site.min.css      CSS   1.4 KB    1.1 KB
Select2.min.js    JS    25.1 KB   24.8 KB
Select2.min.css   CSS   3.1 KB    2.8 KB

Domain sharding results: needs more verification (I'm sleepy).

According to the previous spec (HTTP/1.1, Section 8.1.4 Practical Considerations), a client should not maintain more than 2 connections with any server or proxy. Thus we had to do a workaround where resources are located on a different server (or, in our case, a different DNS name, or using a Content Delivery Network). Now for our results:


I am thinking about no longer using a CDN if it still uses SPDY or HTTP/1.1; however, looking at the cached results:


Enabling HTTP/2

How did I enable HTTP/2? This site is hosted in Azure as an App Service. To enable it, search for the HTTP version setting in Application Settings, click Save, and refresh.


For on-premises, HTTP/2 is also available on Windows Server 2016/IIS 10 and Windows 10.

Using .NET Core SDK 2.1.0-rc1 with TFS Build

So .NET Core 2.1 RC is out. With its Go-Live support, I am fairly confident that I can upgrade my super secret project.

Upgrading my projects was easy.

I downloaded and installed 2.1.300-RC1, and for VS2017 it should also show in your notifications window, by the way.


Once my tools were upgraded, I made some minor changes to all of my .csproj files.

Make sure that the target framework is 2.1 and the package reference version is 2.1.0-rc1-final. Rebuild, and voila: .NET Core 2.1. Here's an example from one of my APIs.


I will not cover the other new and shiny things about .NET Core 2.1 RC, like those from Preview 1 and Preview 2, but you can certainly look them up here:

Publishing to Azure

So now for the harder part. I saw from a blog post that Azure App Service will start deploying 2.1 RC next week.


But the impatient me went ahead and used the .NET Core Tool Installer.


I used the name from the release notes.

That broke my build, saying that the .NET Core package name is wrong.

But I saw a GitHub issue thread that says: to use 2.1.0-RC, use "2.1.300-rc1-008673" to download the SDK, and "2.1.0-rc1" to download the runtime.

Voila! That works!


Now for the fun part.

Using UseHttpsRedirection and the HttpClient extensions!

That’s all for now!

Robocopy /MT on Windows 10 (Revisited)

I blogged about the /MT switch of Robocopy way back on Windows 7, and I have been using it for the longest time for moving large amounts of files.

This time I had a chance to screenshot some personal files being moved to another drive. Since it's not for work, I might as well share this with everyone.

Note that these are not solid-state drives, just plain HDDs, used for storage only.

So I started with this command, which moves everything (even empty folders) to the specified destination and logs the operation to a .log file:

“robocopy <source> <destination> /mt:120 /E /move /log:robocopy.log”


Around 20GB-ish worth of files are being moved now.


And checking the disk activities while the copy is running.


And for the results!



RemoteFX 3D Video Adapter

Hey, this will be a quick post, or even a micro post, about a Hyper-V feature called RemoteFX 3D Video Adapter that just shipped with Windows 10 and WS2016.

So I was doing lab work for a Windows Server 2016 deployment and found new features such as RemoteFX.


It seems that I can already share my desktop GPU with my Hyper-V guests! Cool! I already have a lot of use cases in mind, like running my supervised-learning AI on Hyper-V. I just need to install Unity and VS on a VM and play around there so that my PC and Mac are not always maxed out (I will try to constrain it to 4 GB of memory so that I can work while waiting for operations to complete).

Anyway, back to this lab. Yes folks, still using ol’reliable WDS.