Tag Archives: Data

10 Ways to Achieve Internet of Things Success for your Organization

The IoT + IIoT Megatrend is in Danger of Stalling

Many of you who are involved in one of the #IoT segments (industrial, healthcare, consumer, etc.) are currently living in PoC hell. Your pilots, trials and proofs of concept are not making the jump to production for a variety of reasons. I think it's time to push the reset button on how we convey the value of IoT and how we deliver solutions. The best place to start is by listening to customers.

As it turns out, customers aren't interested in hearing how smart you are or which esoteric technologies you're using to build IoT solutions. The only reason they're talking to you is because they've heard Internet of Things solutions can save them money, reduce unplanned downtime/non-productive time, optimize operations, improve worker safety, boost product quality, lower risk and deliver many other compelling value props. Here's a quick list of problems and solutions to get you started:

  1. Customers are finding all the pieces of the IoT puzzle to be too complex. You need to focus on extreme simplicity and reduce friction at every tier of an IoT solution. Hundreds of pages of code examples aren't working.
  2. Customers don’t have the skill sets needed to work with IoT solutions. Good enough has to be good enough, so stop using technologies and protocols that no one has ever heard of and embrace pervasively adopted tech that everyone already understands. If the tech you’re using isn’t familiar to customers, they’ll be uncomfortable about using your solution.
  3. Customers have heard about large-scale IoT hack attacks and are reluctant to move forward due to security concerns. Security and privacy must be baked into your IoT solution from the get-go, and defense in depth must be practiced at every tier of the solution. You must also respect a customer's data governance and sovereignty requirements, even if it means delivering a 100% air-gapped solution.
  4. Customers struggle to achieve an acceptable return on investment on their IoT solutions. Despite lower costs for all the components required to build an IoT solution, when a customer strings together sensors, microcontrollers, communications networks, storage, middleware, servers, analytics, and integration software, it's possible that the combined cost could exceed the expected return. It's critically important to beat up those costs to stay well within the ROI envelope.
  5. Customers don't want another data silo. Too many IoT solutions are focused solely on capturing data from machines and keeping it within their respective systems. It's important to integrate with a customer's existing databases, CRM, ERP and other systems not only to add context to machine data but also to take action on insights. Telling a customer they can write code to call APIs on their backend systems is the wrong answer. Make it easy.
  6. Customers keep hearing you must combine Artificial Intelligence with IoT in order to derive value. The tech industry must stop sending this message because it’s dead wrong and it’s scaring customers away. The average person doesn’t know anything about AI except that they think SkyNet is going to take over the planet and robots will be our overlords. There’s tremendous value in connecting your people and machines to gain real-time visibility and situational awareness over your operations. There’s additional value in layering even the simplest analytics to drive decisions and automation. None of this is rocket science and it’s stuff your customers can easily wrap their head around.
  7. Customers who are pitched horizontal IoT platforms quickly become paralyzed. Stop leading with generic, horizontal IoT platforms that try to be all things to all people because it doesn’t work. Customers are not interested in writing code to implement one of many millions of IoT use cases on the platform you’re selling. Your sales motion should include knowing your customer’s business and always leading with vertical solutions to problems they already want to solve.
  8. Customers often find the tech needed to create a smart, connected product eats too much into product profit margins. IoT-enabling products is a super-important way to provide better, ongoing customer service. Especially when those products come with warranties or SLAs that must be met, companies absolutely require IoT capabilities to reduce their risk and eliminate service calls that eat into profits. The sensors, microcontroller, power source, and connectivity for an individual product must always represent the smallest percentage of the total product cost to ensure mainstream adoption. Otherwise, only early adopters will use your smart, connected product.
  9. Customers are dissatisfied with the results of analytics applied to their IoT data. This often points to poor data quality and/or unlabeled data. Garbage in, garbage out. Ensure your IoT system labels incoming data points and maps unintelligible items like PLC registers to something a human can understand. It's also super-helpful if your IoT system knows the data types and units of measure of the incoming data points inside captured data sets to help both simple and advanced analytics systems make sense of the data. Don't overwhelm customers by delivering 100% of the data communicated by endpoints into an IoT system. For the most part, de-duplicate incoming data and only send anomalous values that stray outside acceptable limits (see the sketch after this list).
  10. Customers have grown tired of IoT projects that take too long. I've heard of managers who green-lighted IoT projects being asked to leave after 3 years of trying to boil the ocean to drive value at an organization. Don't try to boil the ocean anymore. Find small, targeted use cases that can be tackled in just a few months to get tangible, quick wins. When everyone can see the value, move on to the next small project while continuing to build confidence and grow support across the organization. Remember to eat the IoT elephant just one bite at a time.
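To make that last point about de-duplication and report-by-exception concrete, here's a minimal C# sketch of the kind of filter an edge device or gateway might run before forwarding telemetry. The class name, limits and deadband are invented for illustration:

using System;

// Hypothetical report-by-exception filter for an edge gateway.
// A reading is forwarded only when it strays outside the acceptable band
// or changes by more than a deadband; everything else is dropped as noise.
public class TelemetryFilter
{
    private readonly double _min;
    private readonly double _max;
    private readonly double _deadband;
    private double? _lastSent;

    public TelemetryFilter(double min, double max, double deadband)
    {
        _min = min;
        _max = max;
        _deadband = deadband;
    }

    public bool ShouldSend(double value)
    {
        bool outOfLimits = value < _min || value > _max;
        bool changedEnough = _lastSent == null ||
                             Math.Abs(value - _lastSent.Value) >= _deadband;

        if (outOfLimits || changedEnough)
        {
            _lastSent = value;
            return true;
        }
        return false;   // duplicate or in-band value: don't flood the cloud
    }
}

// Usage: var filter = new TelemetryFilter(min: 0, max: 200, deadband: 2.5);
// if (filter.ShouldSend(motorTempF)) { /* publish the reading */ }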

Keep it simple to achieve success!

Why the Internet of Things is as Simple as Twitter

Yes, IoT + IIoT and Twitter are truly birds of a feather.

Twitter is made up of people who have something to say. These people express themselves by Tweeting. Oftentimes, no one is listening. There are other people on Twitter who choose to follow those Tweeters in order to listen to what they have to say. Those people are called Followers. These folks often follow lots of Tweeters to understand the state of their collective minds. A Follower gets notified when a Tweeter they’re following says something. Through the clever use of Hashtags, followers can also choose to search for specific topics aggregated across many Tweeters in order to derive larger insights. Depending on the insight, the Follower takes action. Sometimes, a Follower wants to say something to a Tweeter. They can do this with a Direct Message (DM). Of course, a Follower can only send a DM if the Tweeter has authorized this by following the Follower back. The Follower may say something to the Tweeter that either changes her behavior or updates her state of mind.

You never know.

The Internet of Things is made up of machines that have something to say. These machines express themselves by Publishing their telemetry data to some nearby or far away computer system over a communications network. Oftentimes, no one is listening and that data just piles up. There are computers, people, apps, analytics, machine learning and automation systems that are interested in what the machines have to say. They are called Subscribers. They Subscribe to lots of Publishers in order to know the current state of their collective health or performance. A Subscriber often gets notified when a Publisher streams new data, which allows them to process that information in near real-time. Subscribers can also choose to search through data swimming in a lake to derive larger insights. Depending on the insight, the Subscriber takes action. Sometimes, a Subscriber or some other endpoint wants to send data to a Publisher or group of Publishers. They can do this through a Command and Control channel. Of course, a Subscriber can only send a message if the Publisher has authorized this action. The Subscriber might send a Command that either changes the Publisher's behavior or updates its configuration.

You never know.
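If you'd rather see the Publisher/Subscriber idea in code than in prose, here's a toy, in-memory sketch in C#. Real IoT systems put a message broker and a network between the two roles, and the topic and payload below are made up, but the shape is the same:

using System;
using System.Collections.Generic;

// A toy broker that makes the Publisher/Subscriber roles concrete.
public class ToyBroker
{
    private readonly Dictionary<string, List<Action<string>>> _subscribers =
        new Dictionary<string, List<Action<string>>>();

    public void Subscribe(string topic, Action<string> handler)
    {
        if (!_subscribers.ContainsKey(topic))
            _subscribers[topic] = new List<Action<string>>();
        _subscribers[topic].Add(handler);
    }

    public void Publish(string topic, string payload)
    {
        // If no one is listening, the message goes nowhere, just like an ignored Tweet.
        if (!_subscribers.TryGetValue(topic, out var handlers)) return;
        foreach (var handler in handlers) handler(payload);
    }
}

// Usage:
// var broker = new ToyBroker();
// broker.Subscribe("pump42/telemetry", msg => Console.WriteLine("Got: " + msg));
// broker.Publish("pump42/telemetry", "{\"rpm\":1750,\"tempF\":180}");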

I realize it's easy to get overwhelmed with the sheer complexity of these systems that are transforming our world. That's why it's important to maintain the simplest view of what these IoT systems are actually doing. Explaining it to others gets easier, which allows you to focus on the specific element that drives value.

Digital Trends and Predictions for 2018

With software and adjacent technologies continuing to eat the world, we see the pace of #digital transformation accelerating in 2018 as organizations strive to enhance their customer and operational intelligence.

Organizations will grapple with a variety of digital technologies and skillsets this year to become more data-driven in order to improve their agility and decision-making capabilities. As always, they'll be looking for ways to simplify operations and get more done with less. We predict the concepts and trends listed below will light a path forward for organizations:

  • Climbing the Stairway from the Edge to the Cloud

The ongoing journey to move data, apps and other digital assets from private, on-premises data centers to public clouds will continue unabated as organizations look to reduce or eliminate internal ICT functions and responsibilities. Even in the midst of cutting costs, organizations will still struggle with concerns around cloud vendor lock-in via PaaS, which will benefit IaaS virtual machines, container technologies like Docker and container orchestration technologies like Kubernetes, Docker Swarm, Mesos and Marathon. Overall, Amazon AWS plus Microsoft Azure and Office 365 will continue to be the biggest beneficiaries of the public cloud megatrend. Along the way, one of the stair steps that remains on-premises is something called the Fog or the Edge. If you're familiar with how content delivery network (CDN) proxy servers around the world cache and speed the delivery of Web content to your browser, Edge gateway devices do something similar. With more and more of an organization's compute occurring in distant, public clouds, Edge devices residing on the local network can cache, aggregate, analyze and speed up cloud content to give employees inside the office a better experience. Edge devices can also be used with the Internet of Things, where they connect to machines and cache, aggregate, and analyze data locally instead of waiting for that data to be transported to a distant cloud. Since neither people nor machines are very tolerant of too much latency, expect the adoption of Edge gateway devices and associated local storage to surge in 2018.

  • Enhanced Networking Inside and Out

As organizations reduce the number of digital assets and activities that take place in-house, the primary role of ICT departments will be to create and maintain fast, reliable connectivity via wired and wireless technologies. Wired networking will be "more of the same" as we push speeds forward with fiber optics and Gigabit Ethernet to shuttle employees out to the Internet. Wireless is where things get more interesting. Inside the office, organizations will continue rolling out 802.11ac Wi-Fi access points running in the 5 GHz band to deliver data and high-bandwidth content like HD video to any device. Outside, the 3GPP has officially signed off on the first 5G specification, which promises to deliver greater bandwidth, lower latency, better coverage, lower battery consumption and a higher number of simultaneously connected devices. As you might imagine, it will take some time to roll out technology based on this spec, so we will look to get more mileage out of 4G technologies like LTE Advanced. On the slower side of things, you have Low-Power, Wide-Area Network (LPWAN) technologies that are making great strides for certain Internet of Things use cases. The ability to create a large wireless network in places where no cellular coverage exists is compelling for organizations capable of managing such a system. If you have devices or machines that don't send much data every day, require years of battery life, or need to send data over long distances, one of the many LPWAN technologies might be a good fit. Whether you're inside or outside, looking for narrowband or broadband, there are plenty of wireless choices for organizations in 2018.

  • Mobility for People and IoT for Machines

While the mobile device revolution has been the biggest megatrend of this new century, the torch has now been passed to the Internet of Things. When you think about it, they're not terribly different from each other except for the endpoints. Mobile device endpoints are proxies for people while Thing endpoints refer to machines (intelligent or otherwise). They're both sending data about themselves and other topics of interest over a network. Both interact with apps, analytics and other on-prem or cloud data sources to derive value and business intelligence. In order to regain a level of simplicity and perhaps sanity, organizations will push back against the use of multiple enterprise platforms for Mobile people and IoT machines. Additionally, many organizations will wash their hands of having to understand an alphabet soup of protocols and myriad IoT standards and revert to using the same Web and Internet standards they already understand. Just like they currently do with Mobile and the Web, organizations will insist that IoT sends and receives JSON data to and from URLs over HTTP/REST while being displayed via HTML5, secured with TLS and brought to life with JavaScript. This use of familiar, widely-used, "good enough" Web technologies will win the day over the more advanced but esoteric technologies currently employed by IoT platforms. This move to simplicity and familiarity will reduce friction and help the Internet of Things deliver value and fulfill its promise the way Mobile, the Web and the Cloud have. Expect big changes in IoT for 2018 along with a big shakeout of the hundreds of Internet of Things platform companies.
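To make the "good enough Web tech" point concrete, here's a minimal sketch of a device posting JSON telemetry to a REST endpoint over HTTPS using nothing more exotic than HttpClient. The URL and payload fields are placeholders, not any particular platform's API:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class TelemetrySender
{
    private static readonly HttpClient Client = new HttpClient();

    // Sends one JSON telemetry reading to an HTTPS (TLS) REST endpoint.
    public static async Task SendReadingAsync(string deviceId, double temperature)
    {
        string json = $"{{\"deviceId\":\"{deviceId}\",\"temperature\":{temperature}}}";
        var content = new StringContent(json, Encoding.UTF8, "application/json");

        HttpResponseMessage response =
            await Client.PostAsync("https://example.com/api/telemetry", content);
        response.EnsureSuccessStatusCode();
    }
}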

  • Digital Twins make Everything Digital

The rise of Digital Twins will give every organization the starting point they’re looking for to begin their Digital Transformation. A Digital Twin is essentially a digital representation of a physical object. It can be a machine, a person, a complex mechanical subsystem, a collection of machines working together on an assembly line, or even a process. These twins have attributes or properties that describe them like a person’s heart rate or a motor’s temperature or current revolutions per minute (RPM). Organizations can assign key performance indicators (KPIs) to the current values of these properties. A red heart rate KPI might be 200 whereas a green motor temperature KPI might be 200 degrees Fahrenheit. Digital Twins can exhibit behavior by executing programming language and/or analytics code against the combination of their current property values and associated KPIs. Not only does this bring everything in an organization to life, it also facilitates the running of simulations to see how things will behave when different types of data points are fed to these Digital Twins. This is definitely the most promising and exciting technology for 2018.
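As a bare-bones illustration of the idea (and not any particular vendor's twin model), a digital twin can be as simple as a named bag of properties with KPI thresholds evaluated against the current values. The property names and limits below are invented for the example:

using System;
using System.Collections.Generic;

public class DigitalTwin
{
    public string Name { get; }
    private readonly Dictionary<string, double> _properties = new Dictionary<string, double>();
    private readonly Dictionary<string, (double Green, double Red)> _kpis =
        new Dictionary<string, (double Green, double Red)>();

    public DigitalTwin(string name) { Name = name; }

    // Define the thresholds that turn a property's KPI yellow or red.
    public void SetKpi(string property, double green, double red) => _kpis[property] = (green, red);

    // Update a property and report its KPI status, the simplest possible "behavior".
    public void Update(string property, double value)
    {
        _properties[property] = value;

        if (_kpis.TryGetValue(property, out var kpi))
        {
            string status = value >= kpi.Red ? "RED" : value >= kpi.Green ? "YELLOW" : "GREEN";
            Console.WriteLine($"{Name}.{property} = {value} ({status})");
        }
    }
}

// Usage:
// var motor = new DigitalTwin("Motor-7");
// motor.SetKpi("TemperatureF", green: 150, red: 200);
// motor.Update("TemperatureF", 205);   // prints Motor-7.TemperatureF = 205 (RED)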

  • Security, Privacy and GDPR cause Organizations to Stumble

Unrelenting cyberattacks keep organizations in a defensive posture rather than moving forward with important digital initiatives and deployments. While we won't cover the myriad security steps every organization must follow in order to stay ahead of individual and state-sponsored hackers, this is one of the most important functions of an ICT department. Organizational leaders who don't take this seriously by not funding the appropriate security technology or staffing the appropriate security employee headcount do so at their own peril. Needless to say, organizations must prioritize the privacy and protection of data, people (employees and customers), and systems if they want to remain viable. To turn up the heat a bit, the European Union's General Data Protection Regulation (GDPR) becomes enforceable on May 25, 2018. This regulation gives control back to EU citizens and residents over their personal data by strengthening data protections for all individuals within the European Union and governing the export of personal data outside the EU. Quite a few companies operating in countries across the globe play it fast and loose with the security and privacy of individual data without user consent. This comes to an end in May when companies can be fined up to €20 million or 4% of their global annual revenue, whichever is greater, for violating this regulation. Any company operating in the EU must obtain explicit consent for all data collected from an individual as well as the reason/purpose for using and processing that data. Additionally, that user consent may be withdrawn. Many companies around the world haven't made the necessary changes to their digital systems to be compliant with GDPR and will be in for a rude awakening in 2018. Data privacy and security matter in a big way.

  • Making Sense of an Avalanche of Data with Advanced Analytics

While data and analytics systems have been around for decades, the amount of data collected for analysis by organizations has increased exponentially. With a 50x growth rate from machines alone, the Internet of Things has become the newest data source for organizations to analyze. Lots of little data integrated from people, machines and business systems adds up to an overwhelming amount of Big Data to make sense of. Luckily, there are an increasing number of streaming and batch analytics systems and tools to tackle this job. Making this trend better is that most of these technologies are open source and free, which helps level the playing field between small, mid-sized and large organizations with varying amounts of money to spend. Head over to Apache.org. Another interesting trend in data science is how Python has surpassed R as the most popular language for Machine Learning. An increase in online courseware, an abundance of scientific libraries, and the fact that Python is one of the easiest programming languages to learn mean you don't always have to have a PhD in Statistics to get the job done. Virtually every organization in the world is looking for Machine Learning/Deep Learning expertise, so this trend should help the supply side of this equation. The last analytics trend that is coming on strong in 2018 has to do with where data is analyzed. It will no longer be the exclusive domain of the cloud or large clusters of servers. The need to answer questions and make decisions more quickly is driving analytics of all types out to the Edge. Thanks to Moore's Law and the need to eliminate latency, more and more edge gateway devices will be performing IFTTT-style rules and even Machine Learning predictions (with models trained in the cloud). There's no shortage of important trends that are simplifying advanced analytics for organizations in 2018.
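As a tiny illustration of the if-this-then-that style logic moving out to edge gateways, consider a sketch like the one below. The rules, thresholds and actions are hypothetical; the point is that the decision happens locally instead of waiting on a round trip to the cloud:

using System;
using System.Collections.Generic;

// A minimal local rule engine of the kind an edge gateway might run.
public class EdgeRuleEngine
{
    private readonly List<(Func<double, bool> Condition, Action<double> Action)> _rules =
        new List<(Func<double, bool>, Action<double>)>();

    public void AddRule(Func<double, bool> condition, Action<double> action) =>
        _rules.Add((condition, action));

    public void Evaluate(double reading)
    {
        foreach (var rule in _rules)
            if (rule.Condition(reading)) rule.Action(reading);
    }
}

// Usage:
// var engine = new EdgeRuleEngine();
// engine.AddRule(v => v > 200, v => Console.WriteLine($"Over temp ({v}F): shut down the pump"));
// engine.Evaluate(latestTemperatureReading);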

Clearly, 2018 is going to be a transformational year where properly-equipped decision-makers and leaders can shift their organization into the next gear to accelerate their digital transformation. Hold on tight.

The Industrial Internet of Things is Like Football

The Industrial Internet of Things is a lot like football. Sensors relay data to devices like a center hikes the ball to a quarterback.

Devices send telemetry to IIoT platforms like a quarterback passes the ball to a receiver. IIoT platforms ingest data like a receiver catches a pass. I think you get the idea.

Is your company enjoying positive outcomes through the use of an Industrial Internet of Things platform?

It’s Time for one Mobile Database to Rule Them All

Migrate Win32 applications using a mobile #database like FoxPro, dBase, Access and #SQL Server Compact to #SQLite across all #mobile devices.

If it weren’t for desktop databases and learning SQL, my career as a developer may never have launched. I learned dBase for DOS in college, moved on to Paradox when Windows arrived on the scene and then fell in love with Access. I want to take this moment to say “I’m sorry” to all the IT departments that watched in horror as workgroup-level Access databases spread like wildfire on NetWare, Windows for Workgroups and NT servers to take over the corporate world. Employees who weren’t developers or DBAs were empowered to build their own solutions.

When devices for the mobile enterprise arrived in the late 90s and early 2000s, new databases like Sybase SQL Anywhere and Microsoft SQL Server Compact picked up where their desktop forebears left off. These tiny relational engines brought serious business apps to life with built-in data sync with server databases. Today, platforms like iOS, Android and Windows are the biggest game in town and the only mobile database that runs on all of them is SQLite. From a pragmatic standpoint, this open source, cross-platform database with ACID (Atomicity, Consistency, Isolation, Durability) support should be your choice to give enterprise mobile data apps the broadest reach. Don't worry about SQLite just being the database flavor of the week. It supports most of SQL-92 and works with most programming languages. It has a public domain license and has been around since the year 2000. It also happens to be the most widely deployed database in the world.
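If you haven't touched SQLite from C# before, here's a minimal sketch using the Microsoft.Data.Sqlite NuGet package (one of several SQLite providers you could choose); the database file and table are placeholders:

using System;
using Microsoft.Data.Sqlite;   // NuGet package: Microsoft.Data.Sqlite

class SqliteDemo
{
    static void Main()
    {
        using (var connection = new SqliteConnection("Data Source=customers.db"))
        {
            connection.Open();

            var create = connection.CreateCommand();
            create.CommandText = "CREATE TABLE IF NOT EXISTS Customer (Id INTEGER PRIMARY KEY, Name TEXT)";
            create.ExecuteNonQuery();

            var insert = connection.CreateCommand();
            insert.CommandText = "INSERT INTO Customer (Name) VALUES ($name)";
            insert.Parameters.AddWithValue("$name", "Contoso");
            insert.ExecuteNonQuery();

            var query = connection.CreateCommand();
            query.CommandText = "SELECT Id, Name FROM Customer";
            using (var reader = query.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"{reader.GetInt64(0)}: {reader.GetString(1)}");
            }
        }
    }
}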

Improve user productivity and increase revenue by using a mobile database that works with every device and keeps your apps working with or without connectivity. Which desktop, mobile or embedded databases are you currently using?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Keep your Mobile Data Safe when Apps Talk to Each Other

Convert Win32 applications using local interprocess communications (IPC) to #mobile #apps that securely send #data to each other via contracts.

In the 90s, platforms and programming languages allowed developers to call functions that were increasingly farther away from the calling code. Calling into subroutines gave way to instantiating classes to call functions. Calling exported functions in separate C DLLs gave way to using Object Linking and Embedding (OLE) to call functions in separate programs. You could even embed the UI of a different program like Excel inside your app.

Developers went nuts with this stuff and started calling functions or passing messages to other local apps using Named Pipes, Mailslots, shared databases, TCP, UDP, message queues and shared files. On Windows Mobile, point-to-point queues were used with multiple executables to get around app memory limits. The problem with IPC is that security took a back seat and apps were just asking to be hacked as they listened for incoming connections like little web servers.

Today's modern mobile platforms don't allow this. Instead, platforms require things like contracts, intents and extensions. These declare API interactions, what information can be shared between two apps, and which files they can open. Users are prompted to give their permission to this type of interaction between apps, which prevents data leakage at the device edge.
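On Windows, for example, a UWP app shares data through the Share contract roughly as sketched below; Android intents and iOS extensions play similar roles on their platforms. The title and text values are placeholders:

using Windows.ApplicationModel.DataTransfer;

// Rough sketch of the UWP Share contract: the source app declares what it will
// hand over, the OS brokers the exchange, and the user picks the target app.
public sealed class ShareExample
{
    public void WireUpSharing()
    {
        DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
        dataTransferManager.DataRequested += (sender, args) =>
        {
            DataRequest request = args.Request;
            request.Data.Properties.Title = "Quarterly Numbers";
            request.Data.Properties.Description = "Shared through the OS, not a private socket.";
            request.Data.SetText("Placeholder text to share with another app.");
        };
    }

    public void Share()
    {
        // Pops the system Share UI so the user explicitly approves where the data goes.
        DataTransferManager.ShowShareUI();
    }
}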

Reduce risk to your business by migrating your apps to a more secure method of data sharing between app sandboxes. What is your organization doing to secure app data sharing?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Reduce Business Risk by Enforcing Security Policies on Data with Digital Rights Management

To enforce #mobile data #security policies directly, get an #EMM solution with #digital rights management to protect #data where it flows & rests.

So far, our EMM journey to secure corporate data has dealt with the issue by broadly securing the entire device via MDM or more narrowly securing the apps that deliver the data using various MAM techniques. The application of security can get narrower still.

The use of digital rights management (DRM) allows IT departments to apply policies directly to documents, keeping data secure no matter where it flows or resides. Sometimes DRM is lumped in with the broader mobile content management (MCM) component of EMM. Security applied directly to data is an effective method of data loss prevention (DLP), using a combination of enterprise directory services, encryption, user identity, and server and client software to keep information in sensitive files from being viewed by the wrong people or systems.

Imagine the scenario where a confidential business document is uploaded to an Internet file sharing provider or emailed to a competitor. Traditional corporate security mechanisms like firewalls or file server access control lists won't save you in this situation. If DRM encryption and security policies were previously applied to this document, it would be unreadable by anyone who tried to open it. This is arguably the most difficult of the EMM security components, so not many vendors offer it.

Reduce risk to your organization by keeping sensitive data secure no matter where it travels or where it rests. What is your company doing to protect its critical data?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Improve User Productivity by Utilizing Cloud Services to Better Serve Mobile Employees

#Mobile employees working around the world are best served by globally distributed #cloud services + replicated #data with low #network latency.

Your organization may have customers and employees distributed all over the world. These people have neither the time nor the patience to wait for data to travel great distances over land or through undersea cables. If your business is currently serving your constituents via an on-premises network or a regional data center, you're not being responsive to their needs.

Take advantage of services provided by top-tier cloud providers with data centers distributed throughout the world. Narrow your list based on analyst and other trusted reviews of Platform as a Service (PaaS/cloud development) and Infrastructure as a Service (IaaS/virtualization) capabilities. Further narrow the list based on network capacity, redundancy, disaster recovery, data handling, support for your existing server operating systems, databases, and programming languages, and connectors to line of business systems.

Whether lifting and shifting existing systems or building new ones in the cloud, you must go beyond just load-balancing within one or two data centers. Your websites and APIs must be distributed globally so customers and employees are automatically directed to the nearest data centers for the best performance. Additionally, the databases that power these systems must be replicated everywhere so everyone is looking at the same data. One word of caution I’d pass along is to be cognizant of data sovereignty requirements that may limit data flow to a particular region or country.

Improve user productivity and increase revenue by providing fast and reliable access to corporate data to employees anywhere in the world. What is your organization doing to support its global workforce?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Mobile Strategies for Business is Now Available

I’m pleased to announce that my newest #book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform Your Business” is now available. #mobile

Mobile Strategies for Business is the first book to clearly explain how executives can digitally transform their organization through a simple, step-by-step process.

The mobile tidal wave has permanently transformed the consumer world and now it's washing up on the shores of the enterprise. This drives the need for an enterprise mobile strategy to mobilize existing applications, modernize infrastructures, build new apps for employees and customers, and bring order to your environment via enterprise mobility management. Mobile Strategies for Business guides you through this transformation and drives positive outcomes including reducing expenses, improving employee productivity, increasing revenue, boosting user engagement and reducing risk.

Based on the top 50 most important enterprise mobility concepts spanning four major topic areas, Mobile Strategies for Business is the first book to clearly explain how to digitally transform your business through a simple, step-by-step process.

You’ll learn how to address the following organizational challenges:

  • How to transform IT infrastructures that are wholly unprepared to deliver on the promise of Mobile and IoT for employees and customers. Learn how to enhance performance, scalability, bandwidth and security to support today’s mobile and cloud workloads.
  • How to reconcile the convergence of the Bring Your Own Device (BYOD) phenomenon and the need to keep corporate data secure. Learn how to support the flexible work styles of your mobile employees while keeping everything safe.
  • How to migrate the millions of out-of-date, insecure and unsupported desktop and Web 1.0 apps that currently run global business to run on modern mobile platforms. Learn how to unchain your line of business apps and web sites from the desktop and move them to the mobile devices your employees actually use.
  • How to rapidly build mobile enterprise apps that run on any platform and work with data from any backend system. Learn how to mobile-enable your existing systems and data to empower your mobile employees and reach out to your mobile customers.
Mobile Strategies for Business is a project plan and an implementation guide allowing your organization to digitally transform so it can ride the mobile wave to employee and customer success. Along the way, it builds a future-looking foundation that prepares your organization for successive technology tidal waves that will impact your business, workforce and customers.

What is your organization doing to define and execute a mobile strategy? It's time to empower your mobile workforce.

Click to purchase a copy of my book today and start transforming your business!

Getting Started with Azure IoT services: Securing Event Hub Telemetry with SAS Tokens

To prevent the Internet of Things from becoming the largest attack surface in the history of computing, security at scale is paramount. #IoT

Any company that wants to be taken seriously as an IoT platform player has to provide cloud-scale telemetry ingestion while also delivering security to millions of events per second without skipping a beat. This is no easy task and therefore narrows down the field in this space dramatically. Microsoft Azure IoT services accomplish this through the use of Shared Access Signatures (SAS). These provide delegated, limited access to resources such as Event Hubs for a specified period of time with a specified set of permissions. Of course, they do this without requiring you to share the account access keys you created in the previous Event Hub article. You might remember creating a Shared Access Policy with Send permissions. You gave that policy a name and were given a connection string that includes the account access key, which you used to test out a .NET IoT client. Good for testing. Career-limiting for production. That's why you're reading this article.

When it comes to securely sending telemetry to Event Hubs, IoT devices and field gateways claim access to the Event Hub by presenting a SAS token. This token consists of the resource URI being accessed and an expiry, signed with the account access key. Basically, it's a URL-encoded string that is passed along every time telemetry is sent. Each IoT device needs its own distinct SAS token, and creating those tokens is what you're going to learn today.

To more easily create SAS tokens for your IoT clients, I want you to create a simple app to do the work for you. Launch Visual Studio, create a new C# Windows Forms application and call it SASToken. From the Solution Explorer, right-click on References and select Manage NuGet Packages…

In the Search Online box type Azure Service Bus and install version 2.7.5 or later. Since you’ll be using the SharedAccessSignatureTokenProvider class to create a shared access signature for your publisher, add using Microsoft.ServiceBus; above the namespace with all the other using statements in the default Form class.

The next thing I want you to do is create a function called CreateSASToken() inside the Form class as shown below:

Create SAS Token
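In case the screenshot above doesn't render for you, here's a sketch of what that function looks like. It assumes the standard Event Hubs publisher resource path of https://{namespace}/{event hub}/publishers/{publisher}; the parameter values come from the list below:

// Add this inside the Form class (requires the using Microsoft.ServiceBus; statement
// from the WindowsAzure.ServiceBus NuGet package you installed above).
private string CreateSASToken(string eventHubUri, string eventHubName, string publisher,
                              string policyName, string policyKey, TimeSpan expiration)
{
    // Build the publisher-specific resource URI the token will grant access to.
    string resourceUri = String.Format("https://{0}/{1}/publishers/{2}",
                                       eventHubUri, eventHubName, publisher);

    // Sign the resource URI and expiry with the Send policy's key.
    return SharedAccessSignatureTokenProvider.GetSharedAccessSignature(
        policyName, policyKey, resourceUri, expiration);
}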

This function simplifies the creation of a SAS token by inputting values found on the Azure portal for your Event Hub. Let’s walk through the parameters of this function and where you can find the required values:

  • EventHubUri: This is found on the Dashboard page of your Event Hub under Event Hub URL. Don't include the last part of the URL after the final slash (/).
  • EventHubName: This is found at the top of your Event Hub Dashboard page.
  • Publisher: This is a unique name you get to create for the IoT device that’s sending the telemetry to the Event Hub.
  • PolicyName: This is found on the Configure page of your Event Hub and is the name of the shared access policy you created with Send permissions.
  • PolicyKey: At the bottom of your Event Hub’s Configure page is a section called shared access key generator. Select the correct Policy Name from the dropdown box and copy the Primary Key in the text box below it.
  • Expiration: Enter the number of minutes you want your token to be valid. This TimeSpan code can be changed so you can use days or hours as well.

With the function up and running, you can now create unique tokens for each of your Publishers rather than insecurely using the same connection string for all of them. This also means that your Event Hub can prevent individual Publishers from sending telemetry if any of them have been compromised. To make better use of this function, follow along and build a simple data entry form.

Load the default Form in Visual Studio and add the following UI controls and associated properties:

  • Label: Text = Event Hub Uri:
  • TextBox: Name = txtEventHubUri
  • Label: Text = Event Hub Name:
  • TextBox: Name = txtEventHubName
  • Label: Text = Publisher:
  • TextBox: Name = txtPublisher
  • Label: Text = Policy Name:
  • TextBox: Name = txtPolicyName
  • Label: Text = Policy Key:
  • TextBox: Name = txtPolicyKey
  • Button: Name = btnCreateSAS  Text = Create SAS Token
  • Label: Text = SAS Token:
  • TextBox: Name = txtSASToken

In order to bring things to life, create a click event for the Button and add the following code:

Create SAS Code
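If the screenshot is hard to make out, the click handler amounts to something like this sketch, using the control names from the list above and the 60-minute expiration mentioned below:

private void btnCreateSAS_Click(object sender, EventArgs e)
{
    // Gather the Event Hub details from the TextBoxes and create a token
    // that's valid for 60 minutes (adjust the TimeSpan as needed).
    txtSASToken.Text = CreateSASToken(
        txtEventHubUri.Text,
        txtEventHubName.Text,
        txtPublisher.Text,
        txtPolicyName.Text,
        txtPolicyKey.Text,
        TimeSpan.FromMinutes(60));
}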

The code calls the CreateSASToken() function you created and passes in the values you type or paste into the TextBoxes. I hard-coded in 60 minutes but you can make that any number you like and you could even add a NumericUpDown control. The function returns a SAS token as a string and displays it in the TextBox at the bottom of the Form.

At this point, go ahead and run the app you just built. Type in or paste the appropriate values from the Azure portal into the TextBoxes. I called my Publisher 007 but you can call it anything you want. Click the button and you should get a SAS token as shown below:

SAS Form

While you now have an easy way to create SAS tokens, this won’t suffice at large scale. You’ll need to use what you’ve learned here to build a secure, on-premises or cloud-based token service to support deployment to thousands or even millions of individual IoT devices.

With your unique SAS token in hand, it's time to modify the app you created in the previous Event Hub article. Load the ContosoIoTConsole solution in Visual Studio and get ready to make a few changes.

Just like you did with the SAS token app, add using Microsoft.ServiceBus; above the namespace with all the other using statements in the Program class. Next, delete the first two lines of code inside Main() where you previously created a connectionString and an EventHubClient. In place of the deleted code, you'll declare a string called sasToken and paste in the long SAS token string that was generated by the Windows app you just built. Next, you'll declare a connectionString and use the ServiceBusConnectionStringBuilder along with your Service Bus URI, Event Hub name, Publisher name, and SAS token to create it, instead of reading the account access key from App.config as you did in the previous article. In the final, new line of code, you'll create an EventHubSender based on this new connection string. Every other line of code below stays the same. Your updated ContosoIoTConsole app should look like the code below with your Event Hub values substituted for mine:

Event Hub Sender Code
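If the code above doesn't come through, the updated Main() looks roughly like the sketch below. The namespace URI, Event Hub name and telemetry payload are placeholders for your own values; the publisher name matches the 007 example from earlier:

using System;
using System.Text;
using Microsoft.ServiceBus;               // ServiceBusConnectionStringBuilder
using Microsoft.ServiceBus.Messaging;     // EventHubSender, EventData

class Program
{
    static void Main(string[] args)
    {
        // Paste in the token generated by the Windows Forms app you just built.
        string sasToken = "SharedAccessSignature sr=...&sig=...&se=...&skn=...";

        // Build a connection string scoped to your namespace, Event Hub and Publisher.
        string connectionString = ServiceBusConnectionStringBuilder.CreateUsingSharedAccessSignature(
            new Uri("sb://yournamespace.servicebus.windows.net"),
            "youreventhub",
            "007",
            sasToken);

        // Create the sender from the SAS-based connection string instead of the account key.
        EventHubSender sender = EventHubSender.CreateFromConnectionString(connectionString);

        // Everything below stays the same as the previous article; this payload is a placeholder.
        string telemetry = "{\"deviceId\":\"007\",\"temperature\":72.5}";
        sender.Send(new EventData(Encoding.UTF8.GetBytes(telemetry)));
        Console.WriteLine("Telemetry sent");
    }
}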

All that’s left to do is try it out by running the console app and then checking your Event Hub Dashboard a few minutes later to see if a new message arrived.

By following the directions and code in this article, you've made the leap to getting an IoT client to send telemetry to Event Hubs more securely. While Event Hubs has always required transport via TLS, by presenting a SAS token, Event Hubs knows who the IoT client is and what permissions it has. A SAS token's ability to gain access to Event Hubs doesn't last forever due to the expiration limit you place on it when creating a new token, which is a good thing. Furthermore, Event Hubs gives you device blacklisting capabilities by revoking individual publishers based on the unique name you gave them. Expired tokens and revoked publishers will result in errors being thrown in the client code when a publisher attempts to send telemetry to an Event Hub. Keep in mind that when you do a mass deployment, your IoT clients and field gateways won't have this information hard-coded like the example we just walked through. It must be encrypted and will often be baked into the hardware silicon as the IoT devices are being manufactured. Stay secure!

 

Interview with Rob Tiffany at Tech Ed Europe

Check out the interview I did with David Goon at Tech Ed Europe 2009 in Berlin.

I discuss Microsoft’s Mobile Enterprise Application Platform and talk about how it aligns with Gartner’s MEAP critical capabilities and how it can save money for companies.

With the tidal wave of mobile and wireless technologies sweeping across both the consumer and enterprise landscapes, I believe MEAP offerings give us a glimpse of a new standard for designing all future infrastructures.