I’m thrilled to be included in this group of #technology #influencers and luminaries like Werner Vogels, Steve Wozniak and Mark Russinovich.
To become one of those technology influencers, it’s taken a lot of years of hands-on experience building mobile, cloud and Internet of Things solutions combined with writing books, speaking at conferences around the world, blogging, tweeting and mentoring.
#Mobile employees working around the world are best served by globally distributed #cloud services + replicated #data with low #network latency.
Your organization may have customers and employees distributed all over the world. These people have neither the time nor the patience to wait for data to travel great distances over land or through undersea cables. If your business is currently serving your constituents via an on-premises network or a regional data center, you’re not being responsive to their needs.
Take advantage of services provided by top-tier cloud providers with data centers distributed throughout the world. Narrow your list based on analyst and other trusted reviews of Platform as a Service (PaaS/Cloud development) and Infrastructure as a Service (IaaS/Virtualization) capabilities. Further narrow the list based on network capacity, redundancy, disaster recovery, data handling, support for your existing server operating systems, databases, programming languages, and connectors to line-of-business systems.
Whether lifting and shifting existing systems or building new ones in the cloud, you must go beyond just load-balancing within one or two data centers. Your websites and APIs must be distributed globally so customers and employees are automatically directed to the nearest data centers for the best performance. Additionally, the databases that power these systems must be replicated everywhere so everyone is looking at the same data. One word of caution I’d pass along is to be cognizant of data sovereignty requirements that may limit data flow to a particular region or country.
Improve user productivity and increase revenue by providing fast and reliable access to corporate data to employees anywhere in the world. What is your organization doing to support its global workforce?
Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”
If you’re attending Visual Studio Live! Redmond 2015, come check out my session on Making the Internet of Things Real with Azure #IoT Services.
At Microsoft, we believe that the Internet of Things starts with your things by building on the infrastructure you already have and using the devices you already own. Microsoft has played a central role in facilitating Internet of Things (IoT) forerunners including SCADA (Supervisory Control and Data Acquisition) and M2M (Machine to Machine) since the 1990s. We’ve provided real-time, embedded platforms to power sensors that no one ever sees, plus advanced robotics, medical devices and human machine interfaces (HMI), just to name a few. Today, Microsoft Azure IoT services deliver hyper-scale telemetry ingestion, streaming analytics, machine learning and other components to unlock insights from the Internet of Your Things in order to transform your business.
To prevent the Internet of Things from becoming the largest attack surface in the history of computing, security at scale is paramount. #IoT
Any company that wants to be taken seriously as an IoT platform player has to provide cloud-scale telemetry ingestion while securing millions of events per second without skipping a beat. This is no easy task and therefore narrows down the field in this space dramatically. Microsoft Azure IoT services accomplishes this task through the use of Shared Access Signatures (SAS). They provide delegated, limited access to resources such as Event Hubs for a specified period of time with a specified set of permissions. Of course, it does this without having to share the account access keys you created in the previous Event Hub article. You might remember creating a Shared Access Policy with Send permissions. You gave that policy a name and were given a connection string that includes the account access key, which you used to test out a .NET IoT client. Good for testing. Career-limiting for production. That’s why you’re reading this article.
When it comes to securely sending telemetry to Event Hubs, IoT devices and field gateways claim access to the Event Hub by presenting a SAS token. This token consists of the resource URI being accessed and an expiry, signed with the account access key. It’s basically a URL-encoded string that is passed along every time telemetry is sent. Each IoT device needs its own distinct SAS token, and creating those tokens is what you’re going to learn today.
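To make that concrete, a Service Bus SAS token is a single string of name/value pairs. Its general shape, with placeholder values standing in for the real fields, looks something like this:

```
SharedAccessSignature sr={URL-encoded resource URI}&sig={URL-encoded HMAC-SHA256 signature}&se={expiry as Unix epoch seconds}&skn={shared access policy name}
```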
To more easily create SAS tokens for your IoT clients, I want you to create a simple app to do the work for you. Launch Visual Studio and create a new C#, Windows Forms application and call it SASToken. From the Solution Explorer, right-click on References and select Manage NuGet Packages…
In the Search Online box type Azure Service Bus and install version 2.7.5 or later. Since you’ll be using the SharedAccessSignatureTokenProvider class to create a shared access signature for your publisher, add using Microsoft.ServiceBus; above the namespace with all the other using statements in the default Form class.
The next thing I want you to do is create a function called CreateSASToken() inside the Form class as shown below:
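Here’s a minimal sketch of what that function might look like. It leans on the SDK’s SharedAccessSignatureTokenProvider.GetPublisherSharedAccessSignature method from the Service Bus 2.x library; the parameter names are my own and mirror the values described next, and the exact overload may vary by SDK version:

```csharp
private string CreateSASToken(string eventHubUri, string eventHubName,
    string publisher, string policyName, string policyKey, TimeSpan expiration)
{
    // Ask the Service Bus SDK to mint a publisher-scoped SAS token.
    // The token grants this one publisher Send access until it expires.
    return SharedAccessSignatureTokenProvider.GetPublisherSharedAccessSignature(
        new Uri(eventHubUri),   // e.g. sb://yournamespace.servicebus.windows.net
        eventHubName,           // the Event Hub the token is scoped to
        publisher,              // unique name for the sending IoT device
        policyName,             // shared access policy with Send permissions
        policyKey,              // primary key of that policy
        expiration);            // how long the token remains valid
}
```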
This function simplifies the creation of a SAS token by inputting values found on the Azure portal for your Event Hub. Let’s walk through the parameters of this function and where you can find the required values:
EventHubUri: This is found on the Dashboard page of your Event Hub under Event Hub URL. Don’t include the last part of the URL after the final slash (/).
EventHubName: This is found at the top of your Event Hub Dashboard page.
Publisher: This is a unique name you get to create for the IoT device that’s sending the telemetry to the Event Hub.
PolicyName: This is found on the Configure page of your Event Hub and is the name of the shared access policy you created with Send permissions.
PolicyKey: At the bottom of your Event Hub’s Configure page is a section called shared access key generator. Select the correct Policy Name from the dropdown box and copy the Primary Key in the text box below it.
Expiration: Enter the number of minutes you want your token to be valid. This TimeSpan code can be changed so you can use days or hours as well.
With the function up and running, you can now create unique tokens for each of your Publishers rather than insecurely using the same connection string for all of them. This also means that your Event Hub can prevent individual Publishers from sending telemetry if any of them have been compromised. To make better use of this function, follow along and build a simple data entry form.
Load the default Form in Visual Studio and add the following UI controls and associated properties:
Label: Text = Event Hub Uri:
TextBox: Name = txtEventHubUri
Label: Text = Event Hub Name:
TextBox: Name = txtEventHubName
Label: Text = Publisher:
TextBox: Name = txtPublisher
Label: Text = Policy Name:
TextBox: Name = txtPolicyName
Label: Text = Policy Key:
TextBox: Name = txtPolicyKey
Button: Name = btnCreateSAS Text = Create SAS Token
Label: Text = SAS Token:
TextBox: Name = txtSASToken
In order to bring things to life, create a click event for the Button and add the following code:
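Assuming the control names listed above and the CreateSASToken() function from earlier, the handler can be as simple as this sketch (the 60-minute lifetime is an arbitrary choice):

```csharp
private void btnCreateSAS_Click(object sender, EventArgs e)
{
    // Build a SAS token from the values typed into the form and
    // display it in the TextBox at the bottom. 60 minutes is hard-coded.
    txtSASToken.Text = CreateSASToken(
        txtEventHubUri.Text,
        txtEventHubName.Text,
        txtPublisher.Text,
        txtPolicyName.Text,
        txtPolicyKey.Text,
        TimeSpan.FromMinutes(60));
}
```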
The code calls the CreateSASToken() function you created and passes in the values you type or paste into the TextBoxes. I hard-coded in 60 minutes but you can make that any number you like and you could even add a NumericUpDown control. The function returns a SAS token as a string and displays it in the TextBox at the bottom of the Form.
At this point, go ahead and run the app you just built. Type in or paste the appropriate values from the Azure portal into the TextBoxes. I called my Publisher 007 but you can call it anything you want. Click the button and you should get a SAS token as shown below:
While you now have an easy way to create SAS tokens, this won’t suffice at large scale. You’ll need to use what you’ve learned here to build a secure, on-premises or cloud-based token service to support deployment to thousands or even millions of individual IoT devices.
With your unique SAS token in hand, it’s time to modify the app you created in the previous Event Hub article. Load the ContosoIoTConsole solution in Visual Studio and get ready to make a few changes.
Just like you did with the SAS token app, add using Microsoft.ServiceBus; above the namespace with all the other using statements in the Program class. Next, delete the first two lines of code inside Main() where you previously created a connectionString and an EventHubClient. In place of the deleted code you’ll declare a string called sasToken and paste in the long SAS token string that was generated by the Windows app you just built. Next, you’ll declare a connectionString and use the ServiceBusConnectionStringBuilder along with your Service Bus URI, Event Hub name, Publisher name, and SAS token to create it instead of reading the account access key from App.config like the previous article. In the final, new line of code, you’ll create an EventHubSender based on this new connection string. Every other line of code below stays the same. Your updated ContosoIoTConsole app should look like the code below with your Event Hub values substituted for mine:
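Under those assumptions, the updated Main() might look like the sketch below. The namespace, Event Hub name and Publisher name are illustrative stand-ins for your own values, the SAS token placeholder is where you paste your generated string, and the exact ServiceBusConnectionStringBuilder overload may vary by SDK version:

```csharp
using System;
using System.Text;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

namespace ContosoIoTConsole
{
    class Program
    {
        static void Main(string[] args)
        {
            // Paste in the SAS token generated by the Windows Forms app.
            string sasToken = "SharedAccessSignature sr=...";

            // Build a connection string scoped to this Publisher and token,
            // rather than reading the account access key from App.config.
            string connectionString =
                ServiceBusConnectionStringBuilder.CreateUsingSharedAccessSignature(
                    new Uri("sb://contosoiot.servicebus.windows.net"), // Service Bus URI
                    "contosoeventhub",                                 // Event Hub name
                    "007",                                             // Publisher name
                    sasToken);

            // Create a sender from the SAS-based connection string and send telemetry.
            EventHubSender sender = EventHubSender.CreateFromConnectionString(connectionString);
            sender.Send(new EventData(Encoding.UTF8.GetBytes("Hello IoT")));
        }
    }
}
```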
All that’s left to do is try it out by running the console app and then checking your Event Hub Dashboard a few minutes later to see if a new message arrived.
By following the directions and code in this article, you’ve made the leap to getting an IoT client to send telemetry to Event Hubs more securely. While Event Hubs has always required transport via TLS, when an IoT client presents a SAS token, Event Hubs knows who that client is and what permissions it has. A SAS token’s access to Event Hubs doesn’t last forever, thanks to the expiration limit you place on it when creating a new token, which is a good thing. Furthermore, Event Hubs gives you device blacklisting capabilities by letting you revoke individual publishers based on the unique names you gave them. Expired tokens and revoked publishers will result in errors being thrown in the client code when a publisher attempts to send telemetry to an Event Hub. Keep in mind that when you do a mass deployment, your IoT clients and field gateways won’t have this information hard-coded like the example we just walked through. It must be encrypted and will often be baked into the hardware silicon as the IoT devices are being manufactured. Stay secure!
Microsoft Azure Event Hubs is a managed platform component of Azure #IoT services that provides telemetry data ingestion at cloud scale with low latency and high reliability.
For your Internet of Things (IoT) scenarios, you can think of Event Hubs as the loosely-coupled beginning of an event pipeline that sits between event publishers like sensors and event consumers like Azure Stream Analytics. With industry analysts predicting tens of billions of “Things” sending telemetry over the Internet in the coming years, most data ingestion solutions won’t be able to handle the onslaught of information. Event Hubs and Azure are designed for this very scenario. Unlike queues, Event Hubs implement partitions (shards) to support massive horizontal scale for the processing of a million events per second. Consumer Groups provide consuming applications an independent view of the Event Hub from which to read the telemetry streams that can lead to complex event processing, storage or other downstream services.
Now that you have a brief summary of this event ingestion technology, it’s time to step through the creation of your own Event Hub so you can start bringing your IoT scenarios to life.
Go to your Azure Portal and click the Service Bus icon on the left side of the page as shown below:
If you have an existing Service Bus namespace, then you can reuse it. Otherwise, click Create a New Namespace.
The Create a Namespace dialog will pop up on your screen as shown below:
In this dialog you will enter a unique Namespace Name, select a Region, select a Subscription to bill against, choose Messaging as the Type in order to support Event Hubs and choose Standard as the Messaging Tier. This allows you to support a sufficient number of Brokered connections (AMQP) into the Event Hub and up to 20 consumer groups leading out of the Event Hub. Click the checkbox when you’re done.
With your Service Bus namespace created, click on the appropriate highlighted row as shown below:
Click Event Hubs from one of the choices across the top of the page to bring up the page shown below:
Click Create a New Event Hub.
Select Quick Create, which should be sufficient for most IoT scenarios.
Enter a unique Event Hub Name, select the same Region as your Service Bus Namespace, select a Subscription to bill against, select the Service Bus Namespace you previously created and then click the Create a New Event Hub checkbox.
With your Event Hub created, click on the appropriate highlighted row as shown below:
Click Configure from one of the choices across the top of the page to bring up the page shown below:
The Message Retention text box allows you to configure the number of days you’d like to have your messages retained in the Event Hub with a default of one day. The Event Hub State combo box allows you to enable or disable your Event Hub. Following the Quick Create path gave you a Partition Count of 16. This value is not changeable once it’s been set so you might consider a Custom Create of your Event Hub if you need a different value. Partitions refer to a scale unit where each one supports message ingress of 1 MB/sec and an egress of 2 MB/sec. You can set the number of Event Hub throughput units on your Service Bus Scale page. The default value is set to one.
In your next configuration step, you will create two shared access policies to facilitate security on your message ingress and egress as shown below:
Click into the Name textbox and enter an ingress name then select the Permissions combo box and select Send. Repeat the process on the newly created row below by adding an egress name and then select Manage, Send, and Listen from the combo box. Click the Save icon at the bottom of the page and then you’ll notice that shared access keys are generated for both your message ingress and egress policies. Those keys will be used to create the connection strings used by your IoT devices, gateways and event consumers like Azure Stream Analytics.
To view and use those connection strings, click Dashboard at the top of the page and then click the Connection Information key icon at the bottom of the page to bring up the Access connection information dialog as shown below:
This is where you will go to copy the Shared Access Signature (SAS) key connection strings into your code to authenticate access to entities within the namespace. The authentication and security model ensures that only devices that present valid credentials can send data to an Event Hub. It also ensures that one device cannot impersonate another device. Lastly, it prevents a rogue device from sending data to an Event Hub by blocking it. Of course, all communication between devices and Event Hubs occurs over TLS.
To wrap things up, click Consumer Groups from one of the choices across the top of the page to bring up the page shown below:
Rather than using the $Default Consumer Group, it’s a good idea to specify one or more of them yourself to create views of the Event Hub that will be used by things like Stream Analytics. This is a simple process that starts with clicking the + Create icon at the bottom of the page.
The Create a Consumer Group dialog will pop up on your screen as shown below:
Type in a meaningful name in the Consumer Group Name textbox and then click the checkbox to save and exit.
Some of you may be wondering why you need to use Event Hubs for event ingestion when you’ve been uploading data from disparate clients to servers using SOAP + XML and REST + JSON for more than a decade. The answer has to do with wire protocol efficiency and reliability. By default, Event Hubs use the Advanced Message Queuing Protocol (AMQP) which is an OASIS standard. This is a binary, peer-to-peer, wire protocol designed for the efficient, reliable exchange of business messages that got its start on Wall Street. If it’s good enough for the critical financial transactions between the world’s largest investment banks and stock exchanges, I’m pretty sure it’s good enough for the rest of us.
At this point, your Event Hub should be up and running. The next step is to get a device sending telemetry into your Event Hub so you can see it working. To test this out, I’ll walk you through the creation of a simple Windows console application.
To get started, create a new C# Console Application in Visual Studio 2013 and call it ContosoIoTConsole as shown below:
In the Solution Explorer, right-click on References and select Manage NuGet Packages…
In the Search Online box type Azure Service Bus.
Install Microsoft Azure Service Bus version 2.6.1 or later.
After that, right-click on References again and add a reference to System.Configuration so your application can read from configuration files.
In the Solution Explorer, open the App.config file. You’ll notice that it’s already filled with various Service Bus extensions. I want you to scroll down to the appSettings section at the bottom where you’ll see the beginnings of a Service Bus connection string waiting to be filled-in with your specific Event Hub data as shown below:
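The relevant appSettings entry looks something like the fragment below. The key name comes from the Service Bus NuGet package, and the bracketed placeholders are yours to fill in:

```xml
<appSettings>
  <!-- Service Bus connection string scaffolded by the NuGet package -->
  <add key="Microsoft.ServiceBus.ConnectionString"
       value="Endpoint=sb://[your namespace].servicebus.windows.net;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[your secret]" />
</appSettings>
```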
Replace [your namespace] with the name of the Service Bus Namespace you created in the Azure portal. I called my namespace ContosoIoT.
As you slide across to the right, you’ll see SharedAccessKeyName=. I want you to replace RootManageSharedAccessKey with the name of the data ingress shared access policy you created in your Event Hub. I named mine TelemetrySender.
In order to replace [your secret] with the correct value, go to the Dashboard page of your Event Hub and click the Connection Information key at the bottom of the page. A dialog containing access connection information with connection strings will appear. Copy the connection string from the data ingress shared access policy you created and paste it into Notepad, since it contains more information than you need. Just copy the SharedAccessKey value at the end of the connection string into [your secret], then save and close the file.
Hopefully, along the way you noticed that you can just paste the entire connection string into the value to get the same result as the directions above.
Keep in mind that when you deploy your individual devices to production, they won’t all be using this same key like you’re doing now for this test scenario. SAS tokens based on the shared access policies must be created and used by each device sending data to Event Hubs.
Now it’s time to jump in and write some code. Open Program.cs and add:
using Microsoft.ServiceBus.Messaging;
using System.Configuration;
with all the other using statements found above the namespace.
The actual three lines of code needed to send the IoT equivalent of “Hello World” to your Event Hub are shown below:
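Here’s a sketch of those three lines, assuming your Event Hub is named contosoeventhub (substitute your own name). You’ll also need using System.Text; for the Encoding class:

```csharp
// Read the connection string the NuGet package scaffolded into App.config.
string connectionString = ConfigurationManager.AppSettings["Microsoft.ServiceBus.ConnectionString"];

// Create a client bound to your Event Hub by name.
EventHubClient client = EventHubClient.CreateFromConnectionString(connectionString, "contosoeventhub");

// Send the IoT equivalent of "Hello World" as UTF-8 encoded event data over AMQP.
client.Send(new EventData(Encoding.UTF8.GetBytes("Hello IoT")));
```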
First you grab the connection string you created in App.config. Next, you create an EventHubClient based on the connection string and the name of your Event Hub. Lastly, you call the Send method to pass along encoded event data as AMQP. In this scenario you’re only sending a simple string but you can send classes as well.
Run this console app several times and then wait a few minutes before checking the Event Hub dashboard in your browser since it doesn’t update in real time. Verify that your “Hello IoT” messages made it to their destination. Congratulations!
You’re now up and running with the basics of high-speed, high-scale telemetry ingestion in Azure for all your IoT and M2M scenarios. Now it’s time to move from a simple “Hello IoT” example to something more real-world like a street parking scenario found in a smart city.
One feature of Smart Cities is to help drivers find free parking spaces on city streets using their smartphones, tablets or in-car navigation apps. This is accomplished by embedding low-power, magnetic sensors in the streets near the curbs where free or metered parking spots are available. These sensors detect the absence or presence of a large metal object above them and relay this Boolean (Yes/No) state via a low-power, 6LoWPAN mesh network to a nearby field gateway that’s probably mounted on a street light.
Modelling this data via your existing console app is trivial and only requires the addition of a class + minimal code to hydrate an object with data and serialize it for transport. To get started, return to your ContosoIoTConsole solution in Visual Studio, right-click on References and add a reference to Newtonsoft.Json to support serializing your new class as JSON.
Next up, right-click on your existing ContosoIoTConsole project and add a public class called StreetParking that looks like the code shown below:
For this example, you’re just going to model a single street block and GPS coordinates with four available parking spaces to choose from.
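A sketch of such a class is below; the property names are my own invention, so feel free to rename them to suit your scenario:

```csharp
public class StreetParking
{
    // Identifies the street block being monitored (hypothetical name).
    public string StreetBlockId { get; set; }

    // GPS coordinates of the block.
    public double Latitude { get; set; }
    public double Longitude { get; set; }

    // Occupancy state reported by the four in-street magnetic sensors.
    public bool Space1Occupied { get; set; }
    public bool Space2Occupied { get; set; }
    public bool Space3Occupied { get; set; }
    public bool Space4Occupied { get; set; }
}
```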
Jumping back to the Program class from the previous Hello World example, you’ll be re-using the connectionString and EventHubClient code at the top and bottom of Main() below:
Since you’ll be serializing your StreetParking class as JSON, add using Newtonsoft.Json; above the namespace with all the other using statements.
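Between the existing connectionString and EventHubClient lines at the top and the end of Main(), the additions might look like this sketch (the property names and sample values are illustrative, matching whatever shape you gave StreetParking):

```csharp
// Instantiate and hydrate a StreetParking object with sample data.
var parking = new StreetParking
{
    StreetBlockId = "Block42",
    Latitude = 47.6740,
    Longitude = -122.1215,
    Space1Occupied = true,
    Space2Occupied = false,
    Space3Occupied = false,
    Space4Occupied = true
};

// Serialize the object as a JSON string and send it to the Event Hub.
string json = JsonConvert.SerializeObject(parking);
client.Send(new EventData(Encoding.UTF8.GetBytes(json)));
```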
The new code instantiates a new StreetParking object, hydrates all its properties with data, serializes the object as a JSON string and then sends the data to your Event Hub. With these code additions made, run your console app a few times to verify that your street parking event arrived in the Event Hub.
I’m pleased to announce that my newest book, “Keeping Windows 8 Tablets in Sync with SQL Server 2012,” is now available for sale.
Spending a decade travelling the globe to help the world’s largest companies design and build mobile solutions has taught me a few things. Large organizations are not interested in constantly running on the new technology hamster wheel. They prefer to leverage existing investments, skills, and technologies rather than always chasing the next big thing. Don’t believe me? Take mobile and the cloud for example:
In 2003 I was building Pocket PC solutions for large companies that wirelessly connected apps on those devices to SAP. I assumed mobile was going mainstream that year. I was wrong. I was early. Mobile apps wouldn’t explode until the end of the decade with the iPhone 3G.
In 2004, my partner Darren Flatt and I launched the first cloud-based mobile device management (MDM) company to facilitate software distribution and policy enforcement on early smartphones and handhelds. Early again. MDM didn’t get big until the end of the decade.
At PDC in 2008, my company launched our cloud offering called Azure. We skipped directly to the developer Nirvana called Platform as a Service (PaaS). I spent a few years doing nothing but speaking and writing about Windows Phones communicating with Web Roles. Turns out companies wanted to take smaller steps to the cloud by uploading their existing servers as VMs.
Being early over and over again taught me how the real world of business operates outside of Redmond and Silicon Valley. Businesses need to make money doing what they do best. Where appropriate, they will use technology to help them improve their processes and give them a competitive advantage. So let’s cut to the chase and talk about why I wrote my new book:
Tablets and Smartphones are taking over the world of business and outselling laptops and desktops. This is a well-known fact and not speculation on my part.
There are 1.3 billion Windows laptops, tablets, and desktops being used all over the world. Windows 7 is in first place with Windows XP in second.
Companies run their businesses on Microsoft Office combined with tens of millions of Win32 apps they created internally over the last 2 decades. Intranet-based web apps also became a huge force starting in the late 90s.
Tools like Visual Basic, Access, PowerBuilder, Java, and Delphi made it easy to rapidly build those Win32 line of business apps in the 90s and helped ensure the success of Windows in the enterprise.
Many of those developers moved to VB and C# in the 2000s to build .NET Windows Forms (WinForms) apps that leveraged their existing Visual Basic skills from the 90s.
Some businesses built Service Oriented Architecture (SOA) infrastructures of Web Services based on SOAP and XML over the last decade in order to connect mobile devices to their servers. Most businesses did not, and instead opted for out-of-the-box solutions that didn’t require them to write a lot of code so they could get to market faster.
While the “white collar” enterprise recently started building business apps for the iPhone and iPad, the “blue collar” enterprise has been building WinForms apps for rugged Windows Mobile devices using the .NET Compact Framework and a mobile database called SQL Server Compact for over a decade.
Most businesses run servers in their own data centers. Many of them are using virtualization technologies like Hyper-V and VMware to help them create a private cloud.
Of the businesses that have dipped their collective toes in the public cloud for internal apps, most of them are following the Infrastructure as a Service (IaaS) model where they upload their own servers in a VM. Just look at the success of Amazon and the interest in Azure Infrastructure Services.
So the goal of my new book is to help businesses transition to the tablet era in a way that respects their existing investments, skills, technologies, enterprise security requirements, and appetite for risk.
Since I’ve been involved in countless mobile projects where companies used the Microsoft data sync technologies already baked into SQL Server and SQL Server Compact, I decided to illustrate how to virtualize this sync infrastructure with Hyper-V. With an eye towards existing trends that are widely embraced, this gives businesses the flexibility to use this proven technology in a private, public, or hybrid cloud. Companies authenticate their employees against the same Active Directory they’ve used for over a decade. I’m deadly serious about security, and you’ll be glad to know the technology in this book handles it at every tier of your solution with Domain credentials plus encrypted data-at-rest and data-in-transit. You also have the option of synchronizing mobile data with any edition of SQL Server 2005, 2008 or 2012 using Microsoft sync technologies that take care of all the data movement plumbing. Your development team avoids writing thousands of lines of code to create web services, sync logic, change tracking, error handling, and retry logic. With Microsoft lowering risk to your project by taking care of the server backend, security, and data sync technologies, your team can focus on building the best possible Windows 8 tablet app for the enterprise.
Speaking of tablet app development, it’s important to show you a path that doesn’t force you to learn all-new tools or programming languages, frameworks, or paradigms. As a developer, you get to keep using Visual Studio along with the Desktop WinForms skills you’ve mastered over the last decade. Better still, you can accomplish everything using the free version of Visual Studio 2012. While you might be thinking Windows 8 tablet solutions must be created via Windows Store apps, this is not the case. Instead, I show you how to apply Modern UI principles to Desktop WinForms apps that are full-screen and touch-first. Concepts like content over chrome, use of typography, and UI elements with large hit targets are all covered in detail. I also respect your investment in Windows 7 laptops and tablets by ensuring your touch apps are backwards compatible and keyboard + mouse/trackpad friendly.
If you’re looking to build a new Windows 8 tablet app using what you have and what you know, this book is for you. If you’re looking to port an existing Windows XP or Windows Mobile WinForm app to a Windows 8 tablet, this book empowers you with the skills to make your porting effort a successful one.
The takeaway is you don’t have to scrap your existing investments to participate in the tablet revolution. I purposely made the book low-cost, hands-on, short, and to-the-point so you can rapidly build mobile solutions for Windows 8 tablets instead of wasting your time with theory. Click here to take “Keeping Windows 8 Tablets in Sync with SQL Server 2012” for a spin so you can start building mobile apps for the world’s first and only enterprise-class tablet today.
While he’s most proud of Windows Azure and SQL Azure, he also gives our competitors their due by mentioning that they have out-executed us when it comes to mobile experiences. He harps on the subject of how complexity kills and then challenges us to close our eyes and form a realistic picture of what a post-PC world might actually look like.
Ray goes on to state that those who can envision a plausible future that’s brighter than today will earn the opportunity to lead. His ultimate dream is to move us toward a world of:
Cloud-based continuous services that connect us all and do our bidding. These are websites and cloud-based agents that we can rely on for more and more of what we do. On the back end, they possess attributes enabled by our newfound world of cloud computing: They’re always-available and are capable of unbounded scale.
Appliance-like connected devices enabling us to interact with those cloud-based services. This goes beyond the PC and will increasingly come in a breathtaking number of shapes and sizes, tuned for a broad variety of communications, creation & consumption tasks. Each individual will interact with a fairly good number of these connected devices on a daily basis – their phone / internet companion; their car; a shared public display in the conference room, living room, or hallway wall.
As a Mobility Architect at Microsoft, I’m excited that my commitments align with this vision in connecting the Peanut Butter of the Cloud with the Chocolate of devices. Wireless data networks, bandwidth, latency and signal coverage are the wildcards when it comes to making this vision a reality. That’s why you’ll always see my concern for this Wireless wildcard reveal itself in all the Cloud-connected mobile architectures I design.