Common Sense Connected Intelligence for 2020

When it comes to Connected Intelligence technologies like #Mobile, #5G and the #InternetOfThings, I’m all about moving the “Value Needle” as quickly, easily and sustainably as possible. #IoT #IIoT

This means following the same strategy you use when taking a big test. Do the easy stuff first and leave the hard things for later. So how does that apply to connecting your organization’s people and things to drive additional value? Read on as I illustrate some simple examples to get you started this year:

Remote Knowledge

People doing remote work in the field still capture information with paper and pencil. Then they drive back to the office at the end of the workday to transcribe their scribbled data into their organization's back-office computer system, along with a dose of human error. I know you're thinking this is impossible in the 21st century, but I see it all the time. Thanks to the magic of cellular data networks combined with smartphones, tablets and smart, connected IoT devices, this inefficient activity can come to an end.

If the remote activity requires dynamic, person-to-person interaction, data should be captured and validated by a smartphone or tablet app and wirelessly transmitted back to the organization. If the remote activity is an inspection that consists of taking an analog reading, drop the clipboard and use one or more sensors to convert analog values to digital equivalents and wirelessly transmit the data back to the organization. While newer machines may include the built-in compute, networking, storage and sensors to get the job done, most of the world is filled with older machines that must be retrofitted with these capabilities. If retrofitting isn't possible, then visual inspection of an asset can be conducted by a fixed or mobile camera, where the photo is transmitted back to the organization and computer vision converts the analog image into digital values.
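
If you're wondering what that looks like in practice, here's a minimal sketch in Python of turning an analog sensor reading into a validated digital record queued for wireless transmission. The read_adc() function, the scaling constants and the sensor name are hypothetical placeholders for whatever hardware you actually have in the field.

```python
# Minimal sketch: replace the clipboard by converting an analog sensor reading
# to a validated digital record ready for wireless transmission. read_adc()
# and the scaling constants are hypothetical placeholders for real hardware.
import json
import time

def read_adc() -> int:
    """Placeholder for reading a raw count from an analog-to-digital converter."""
    return 2048  # pretend 12-bit ADC midpoint

def to_psi(raw: int) -> float:
    """Scale a 12-bit ADC count (0-4095) to a 0-150 PSI pressure reading."""
    return (raw / 4095.0) * 150.0

psi = round(to_psi(read_adc()), 2)
if not 0.0 <= psi <= 150.0:
    raise ValueError(f"reading out of range: {psi}")   # validate before it leaves the field

record = {"sensor_id": "pump-07-pressure", "timestamp": time.time(), "psi": psi}
with open("outbox.jsonl", "a") as outbox:              # queue for the cellular/Wi-Fi link
    outbox.write(json.dumps(record) + "\n")
```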

How does this move the Value Needle?

This is the simplest first step you can take in augmenting and/or replacing an expensive activity with connected intelligence. Without employing analytics, you cut costs by reducing or eliminating remote activities that include travel, vehicles and fuel expenses, just to name a few. You lower your risk and liability by reducing the need to put people in vehicles or have them perform inspections in precarious or otherwise dangerous situations. You gain speed and agility through the instant availability of data, allowing your organization to respond to problems and opportunities more quickly.

Connecting Old Things

While we’re all excited about what the future holds, it’s important to interface with the world as it exists in the present. The overwhelming majority of the Things that are all around us have existed for years or even decades. In order to derive valuable insights from these operational technology (OT) machines and environmental systems, you’ll have to connect to them in often unfamiliar ways and learn how to speak their language. Achieving broad success requires you to be comfortable with brownfield projects. In many scenarios you’ll find yourself using low-speed serial cables and communicating via 100+ wire protocols and data formats.

You better have your Rosetta Stone handy. Don’t be surprised if you must interface with a programmable logic controller (PLC) instead of connecting to a machine directly. Oftentimes, the OT folks in factories won’t let you get near their machines. If you want to be successful, you’ll have to make peace with this OT/IT reality.
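
For the Modbus-speaking corner of that brownfield world, here's a minimal sketch that reads a couple of holding registers from a PLC over Modbus TCP. It assumes the third-party pymodbus library and a reachable PLC; exact import paths and keyword arguments vary between pymodbus versions, and plenty of older gear only speaks serial Modbus RTU or a vendor-specific protocol instead.

```python
# Minimal sketch: read two holding registers from a PLC over Modbus TCP,
# assuming the third-party pymodbus library. Exact import paths and keyword
# arguments differ between pymodbus versions.
from pymodbus.client import ModbusTcpClient  # pymodbus 3.x import path

client = ModbusTcpClient("192.168.1.50", port=502)  # hypothetical PLC address
client.connect()

# Function code 0x03: read holding registers, e.g. temperature and RPM.
result = client.read_holding_registers(0, count=2)
if not result.isError():
    temperature_raw, rpm_raw = result.registers
    print(f"temperature={temperature_raw} rpm={rpm_raw}")

client.close()
```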

How does this move the Value Needle?

Many of these existing systems are either unconnected or connected to a closed system, often built by the same manufacturer. Imagine dozens of machines on the shop floor individually connected to their own proprietary analytics systems. By freeing the data found in these systems, you go from having countless data silos to achieving a blend of machine, environmental, organizational and third-party data. This delivers much-needed context and allows you to see the big picture across systems of systems to make better decisions.

Bootstrapping New Things 

You know how all the industry analysts throw darts at a board and tell you about the tens of billions of connected devices we'll have in the coming years? They keep having to backtrack on those predictions for one very important reason: bringing the Internet of Things to life is still largely a manual process of configuring devices, networks, code, platforms and security. IoT today operates like one big aftermarket car stereo store. For the Internet of Things to be the huge success we all want it to be, we have to remove many of these manual processes and leave custom work to those targeting specific use cases.

It must all start when Things are created on an assembly line by original equipment manufacturers (OEMs). The smart, connected machine must have compute, storage, networking, sensors and actuators baked in from the very beginning. In other words, it needs a microcontroller, a cellular module, software, a trusted platform module (TPM) and associated security tokens or certificates from the get-go. This new product will be created in one country, possibly shipped to a distributor in another country, and purchased by a customer in yet another country.

When this smart, connected product wakes up somewhere in the world, it must first make an automatic connection to a local mobile network operator via its cellular module and associated connection management capabilities. Next, it must use that connectivity to access the OEM’s globally-available service and pass along its identity and security credentials. This service will determine what and where the product is, who bought it, and ultimately what IoT platform it should send its telemetry to and receive commands from.
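
Here's a minimal sketch of that first-boot handshake. The provisioning URL, field names and token handling are all hypothetical; real OEMs will have their own APIs and certificate-based authentication.

```python
# Minimal sketch of the first-boot handshake described above: the device calls
# the OEM's global provisioning service with its identity and credentials, and
# is told which IoT platform to use. All URLs and field names are hypothetical.
import json
import urllib.request

def call_json(url: str, payload: dict) -> dict:
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# 1. Announce identity to the OEM's globally available provisioning service.
assignment = call_json(
    "https://provision.oem-example.com/v1/register",
    {"device_id": "SN-0001234", "cert_thumbprint": "AB:CD:EF:..."},
)

# 2. The service replies with the customer's IoT platform endpoint and a token.
telemetry_url = assignment["telemetry_url"]
token = assignment["access_token"]

# 3. From now on, send telemetry to (and receive commands from) that platform.
call_json(telemetry_url, {"device_id": "SN-0001234", "status": "online",
                          "token": token})
```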

How does this move the Value Needle?

By following this process, much of the friction that's holding the Internet of Things back is removed. Time can be better spent targeting specific use cases, allowing customers to get to value more quickly at a lower cost. This better use of human and machine resources will exponentially accelerate the rising tide of IoT that lifts all boats.

Many Edges

I know you've heard a lot about Edge Computing over the last several years. As it relates to the Internet of Things, the Edge just means moving compute and the associated data filtering, aggregation and analytics closer to the machines and environmental systems that actually create the data. When the IoT megatrend heated up ten or so years ago, companies unfamiliar with the decades of capturing telemetry and controlling systems via M2M and SCADA assumed IoT data must go to one of the many public clouds. Unacceptable latency, high broadband costs and data governance issues gave rise to concepts like the Fog and the Edge to mitigate cloud shortcomings. Since I've spent most of my time in the industrial space, this has typically meant placing edge compute near machines on the factory floor, or even on bullet trains, where decisions can be made in milliseconds. Performing computing tasks at the Edge has also worked well for discrete and process manufacturers who say, "the data doesn't leave my factory."
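
To make the idea concrete, here's a minimal sketch of edge-side filtering and aggregation: anomalies are acted on locally in milliseconds, and only a periodic summary ever leaves the factory. The thresholds, the stop_machine() hook and the forwarding function are placeholders.

```python
# Minimal sketch of edge-side filtering and aggregation: act on anomalies
# locally within milliseconds and forward only a periodic summary to the
# cloud. Thresholds and hooks are hypothetical placeholders.
from statistics import mean

WINDOW = []          # readings buffered at the edge
VIBRATION_LIMIT = 7.5

def stop_machine():
    print("local actuation: machine stopped")  # placeholder for a real control signal

def send_to_cloud(summary: dict):
    print("forwarding summary:", summary)      # placeholder for an HTTPS/MQTT publish

def on_reading(vibration_mm_s: float):
    if vibration_mm_s > VIBRATION_LIMIT:
        stop_machine()                 # millisecond-scale decision, no cloud round trip
    WINDOW.append(vibration_mm_s)
    if len(WINDOW) >= 600:             # e.g. ten minutes of one-second samples
        summary = {"avg": mean(WINDOW), "max": max(WINDOW), "count": len(WINDOW)}
        send_to_cloud(summary)         # only the aggregate leaves the factory
        WINDOW.clear()

on_reading(8.2)   # exceeds the limit, so the machine is stopped locally
```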

More recently, the telecom industry has thrown its hat into the ring by placing distributed compute infrastructure and resources at the edge of service provider networks, where "last mile" content and applications are delivered. While this particular Edge isn't on-prem, the concept of supporting low-latency, data-intensive applications still applies since it runs at the edge of cellular networks, significantly closer to the source of IoT data than distant public clouds. It also reduces congestion and signaling load on the core network, so applications and analytics perform better. This architecture is sometimes referred to by the acronym MEC, which can mean either mobile edge computing or multi-access edge computing. Expect to see this type of Edge compute used heavily in smart cities, public infrastructure, video games that need lower ping times, and vehicle-to-everything (V2X) scenarios.

How does this move the Value Needle?

Moving compute resources to the Edge benefits myriad IoT use cases, including split-second application responsiveness, reduced bandwidth costs and congestion, plus the granular data governance needed to meet local, city, state/province and country security and privacy requirements. Public clouds still have their strengths. As always, just use the right tool for the job.

Sustainable Side Effects

When you don't put a person in a car, truck or plane to perform a remote inspection, you're not burning fuel or congesting freeways. Think of wireless data networks as your replacement for travel. Using smart, connected new machines and retrofitting older machines ensures you always know about their health and performance so they can operate more cleanly and efficiently. Edge computing allows you to address problems more quickly while alleviating network congestion. As always, think of the Internet of Things as your early warning system to detect pollution, fires, water leakage, unsafe machinery, excessive electricity usage, deforestation, water contamination, and thousands of other important use cases.

Summary

As you can see, none of the topics I discussed required Machine Learning, Deep Learning, Neural Networks, or any other kind of Artificial Intelligence to move the Value Needle. Throughout 2020 I want you to avoid the hype that bombards us from every direction and focus on specific problems that can be solved in your organization through the use of Connected Intelligence. Steer clear of esoteric technologies and concepts that you and your colleagues struggle to wrap your heads around.

If you start small, keep things simple, and iterate steadily throughout the year, I know you can knock it out of the park and derive tremendous value for your organization while being sustainable.

An Operational Guide to Digital Transformation for People in a Hurry

If you ever find yourself struggling to digitally transform your organization, business units, divisions, processes, assets & people, you might combine #DigitalTwins with the architecture of a distributed #IoT software system as your guide. #IIoT #DigitalTransformation

I know you’ve been converting things like documents, photos, sounds, numbers, blueprints & measurements from the physical world into the digital world for decades so I won’t belabor this point – you’re going to keep doing that.

From now on, you’re going to create everything digitally first. That means you’re using a computer with a screen, and maybe even a mouse. Okay, I won’t hold it against you if you still scribble notes on airplane drink napkins or draw pictures of early designs on the back of paper place mats.

Also, don’t confuse this endeavor with the business process re-engineering projects of the 1990s. You’re not doing a straight port of your manual processes to digital equivalents. Just because Steve Jobs liked skeuomorphism doesn’t mean your digital processes should reflect the old way of doing things.

Determine every way employees, assets, processes, departments & business units interact with each other as well as how they interact with customers, partners & suppliers in order to digitally convert those interfaces into APIs.

Yes, I want you to create RESTful APIs for every one of those interactions. These are your Digital Threads. Oh, and make sure you version them because an organization is always learning and evolving.
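
As a minimal sketch of what one of those Digital Threads might look like, here's a versioned RESTful API wrapping a single interaction, say, field technicians and work orders. It assumes the Flask microframework, and the resource names are purely illustrative.

```python
# Minimal sketch of one versioned Digital Thread: a RESTful API wrapping the
# interaction between field technicians and work orders. Assumes Flask;
# the resource names are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
WORK_ORDERS = {}  # stand-in for a real datastore

@app.route("/api/v1/work-orders", methods=["POST"])
def create_work_order():
    order = request.get_json()
    order_id = len(WORK_ORDERS) + 1
    WORK_ORDERS[order_id] = order
    return jsonify({"id": order_id, **order}), 201

@app.route("/api/v1/work-orders/<int:order_id>", methods=["GET"])
def get_work_order(order_id: int):
    return jsonify(WORK_ORDERS.get(order_id, {}))

# When the interaction evolves, publish /api/v2/... alongside v1 rather than
# breaking existing consumers.
if __name__ == "__main__":
    app.run()
```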

You need to model the properties, artifacts, behaviors, history & events that business units, departments, processes, assets & employees respond to in order to create composite Digital Twins to complete your Digitalization.

Why “composite” digital twins? Because we’re not just talking about atoms here. We’re describing the complex and oftentimes hierarchical relationships that look more like molecules. The human body has important subsystems like the heart, lungs and brain. Your organization is no different.
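
Here's a minimal sketch of that composite idea: each twin carries its own properties and a list of child twins, so you can model an organization, a production line, a machine and its motor as one hierarchy. The names and values are illustrative only.

```python
# Minimal sketch of a composite Digital Twin: each twin has properties and
# children, so assets can be modeled as a hierarchy (line -> machine -> motor),
# much like the body/heart/lungs analogy above. Names are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DigitalTwin:
    name: str
    properties: Dict[str, float] = field(default_factory=dict)
    children: List["DigitalTwin"] = field(default_factory=list)

    def flatten(self, prefix: str = "") -> Dict[str, float]:
        """Roll the whole hierarchy up into one view for analytics."""
        path = f"{prefix}{self.name}"
        values = {f"{path}.{k}": v for k, v in self.properties.items()}
        for child in self.children:
            values.update(child.flatten(prefix=f"{path}."))
        return values

motor = DigitalTwin("motor", {"temperature_f": 180.0, "rpm": 1750.0})
line1 = DigitalTwin("assembly-line-1", {"throughput": 120.0}, [motor])
print(line1.flatten())
```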

Your Digital Twins will run in a distributed platform that manages the Digital Threads connecting employees, assets, processes, departments, business units, customers, partners & suppliers.

At this point, your Digital Transformation should be in beta stage & your organization, with all its components, should come to life as a Digital Organism that probably still needs to be debugged.

Notice how I just fast-forwarded through a multi-month or multi-year process? Becoming a Digital Organization is going to take a while so be patient.

An important takeaway is that you've now digitally documented every last aspect of how your enterprise operates. No more reliance on tribal knowledge, & the institutional memory problems you have due to employee turnover are solved.

Everything is made into software & connected via APIs streaming over 5G at a level no one has ever imagined. Digital Twins will collaborate with people & each other via APIs, apps & analytics to run your business.

Terms like CI/CD & DevOps will take on new meaning as they're used to upgrade some or all of the business as developers make improvements. Scaling your business won't necessarily mean hiring new employees anymore.

Instead, scaling might mean adding containers to a Cloud Native infrastructure. This probably sounds as weird as the first time you heard you had to write a job description for code performing Robotic Process Automation (RPA).

As your digitally transformed organization matures, you’ll be able to determine which functions are best handled by Digital Twins & which ones require human creativity, dexterity, or other unique abilities.

A new level of disruptive automation will sweep through. Don’t go dystopian on me. Like all the preceding industrial revolutions, Industry 4.0 will create more jobs than it eliminates. Lifelong education is the key.

This process is a large mountain to climb & I’m sure you’ve heard hundreds of examples of what it means from other analysts. My definition is a bit more revolutionary because you shouldn’t half-ass something this transformative.

You’re probably wondering, what’s the payoff in making digital transformation a reality? Hyper-efficiency, cost savings, a better customer experience, no more role confusion, no more lost organizational knowledge, & increased revenue via digital scaling.

I hope this synopsis has been a helpful jump start to digitally transforming your organization & how it operates with people, processes, machines & connected intelligence.

Digital Trends and Predictions for 2018

With software and adjacent technologies continuing to eat the world, we see the pace of #digital transformation accelerating in 2018 as organizations strive to enhance their customer and operational intelligence.

Organizations will grapple with a variety of digital technologies and skillsets this year to become more data-driven and improve their agility and decision-making capabilities. As always, they'll be looking for ways to simplify operations and get more done with less. We predict the concepts and trends listed below will light the path forward for organizations:

  • Climbing the Stairway from the Edge to the Cloud

The ongoing journey to move data, apps and other digital assets from private, on-premises data centers to public clouds will continue unabated as organizations look to reduce or eliminate internal ICT functions and responsibilities. Even in the midst of cutting costs, organizations will still struggle with concerns around cloud vendor lock-in via PaaS, which will benefit IaaS virtual machines, container technologies like Docker and container orchestration technologies like Kubernetes, Docker Swarm, Mesos and Marathon. Overall, Amazon AWS plus Microsoft Azure and Office 365 will continue to be the biggest beneficiaries of the public cloud megatrend. Along the way, one of the stair steps that remains on-premises is something called the Fog or the Edge. If you're familiar with how content delivery network (CDN) proxy servers around the world cache and speed the delivery of Web content to your browser, Edge gateway devices do something similar. With more and more of an organization's compute occurring in distant, public clouds, Edge devices residing on the local network can cache, aggregate, analyze and speed up cloud content to give employees inside the office a better experience. Edge devices can also be used with the Internet of Things, where they connect to machines and cache, aggregate and analyze data locally instead of waiting for that data to be transported to a distant cloud. Since neither people nor machines are very tolerant of too much latency, expect the adoption of Edge gateway devices and associated local storage to surge in 2018.

  • Enhanced Networking Inside and Out

As organizations reduce the number of digital assets and activities that take place in-house, the primary role of ICT departments will be to create and maintain fast, reliable connectivity via wired and wireless technologies. Wired networking will be "more of the same" as we push speeds forward with fiber optics and Gigabit Ethernet to shuttle employees out to the Internet. Wireless is where things get more interesting. Inside the office, organizations will continue rolling out 802.11ac Wi-Fi access points running in the 5 GHz band to deliver data and high-bandwidth content like HD video to any device. Outside, the 3GPP has officially signed off on the first 5G specification, which promises to deliver greater bandwidth, lower latency, better coverage, lower battery consumption and a higher number of simultaneously connected devices. As you might imagine, it will take some time to roll out technology based on this spec, so we will look to get more mileage out of 4G technologies like LTE Advanced. On the slower side of things, you have Low-Power, Wide-Area Network (LPWAN) technologies that are making great strides for certain Internet of Things use cases. The ability to create a large wireless network in places where no cellular coverage exists is compelling for organizations capable of managing such a system. If you have devices or machines that don't send much data every day, require years of battery life, or need to send data over long distances, one of the many LPWAN technologies might be a good fit. Whether you're inside or outside, looking for narrowband or broadband, there are plenty of wireless choices for organizations in 2018.

  • Mobility for People and IoT for Machines

While the mobile device revolution has been the biggest megatrend of this new century, the torch has now been passed to the Internet of Things. When you think about it, they're not terribly different from each other except for the endpoints. Mobile device endpoints are proxies for people, while Thing endpoints refer to machines (intelligent or otherwise). Both send data about themselves and other topics of interest over a network. Both interact with apps, analytics and other on-prem or cloud data sources to derive value and business intelligence. In order to regain a level of simplicity and perhaps sanity, organizations will push back against the use of multiple enterprise platforms for Mobile people and IoT machines. Additionally, many organizations will wash their hands of the alphabet soup of protocols and myriad IoT standards and revert to using the same Web and Internet standards they already understand. Just like they currently do with Mobile and the Web, organizations will insist that IoT sends and receives JSON data to and from URLs over HTTP/REST while being displayed via HTML5, secured with TLS and brought to life with JavaScript. This use of familiar, widely used, "good enough" Web technologies will win the day over the more advanced but esoteric technologies currently employed by IoT platforms. This move to simplicity and familiarity will reduce friction and help the Internet of Things deliver value and fulfill its promise the way Mobile, the Web and the Cloud have. Expect big changes in IoT for 2018, along with a big shakeout of the hundreds of Internet of Things platform companies.
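
As a minimal sketch of that "plain Web standards" pattern, here's a device POSTing JSON telemetry to a URL over HTTPS (which gives you TLS) and reading a JSON command back, using nothing but the Python standard library. The endpoint and payload shape are hypothetical.

```python
# Minimal sketch of the familiar Web-standards pattern: JSON telemetry sent
# to a URL over HTTPS/TLS, with a JSON command returned. The endpoint and
# payload shape are hypothetical.
import json
import urllib.request

telemetry = {"device_id": "chiller-42", "temp_c": 4.7, "door_open": False}

req = urllib.request.Request(
    "https://iot.example.com/api/v1/telemetry",   # HTTPS provides the TLS layer
    data=json.dumps(telemetry).encode("utf-8"),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    command = json.load(resp)          # e.g. {"set_point_c": 4.0}
    print("command from platform:", command)
```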

  • Digital Twins make Everything Digital

The rise of Digital Twins will give every organization the starting point they're looking for to begin their Digital Transformation. A Digital Twin is essentially a digital representation of a physical object. It can be a machine, a person, a complex mechanical subsystem, a collection of machines working together on an assembly line, or even a process. These twins have attributes or properties that describe them, like a person's heart rate or a motor's temperature or current revolutions per minute (RPM). Organizations can assign key performance indicators (KPIs) to the current values of these properties. A red heart rate KPI might be 200, whereas a green motor temperature KPI might be 200 degrees Fahrenheit. Digital Twins can exhibit behavior by executing programming and/or analytics code against the combination of their current property values and associated KPIs. Not only does this bring everything in an organization to life, it also facilitates running simulations to see how things will behave when different types of data points are fed to these Digital Twins. This is definitely the most promising and exciting technology for 2018.
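
Here's a minimal sketch of that KPI idea: compare a twin's current property values against thresholds and report red, yellow or green. The thresholds echo the examples above; everything else is illustrative.

```python
# Minimal sketch of KPI evaluation for Digital Twin properties. Thresholds
# mirror the examples in the text; the scheme itself is illustrative.
KPIS = {
    "heart_rate_bpm": {"yellow": 160, "red": 200},
    "motor_temp_f":   {"yellow": 220, "red": 250},   # 200 F still reads green here
}

def kpi_status(property_name: str, value: float) -> str:
    limits = KPIS[property_name]
    if value >= limits["red"]:
        return "red"
    if value >= limits["yellow"]:
        return "yellow"
    return "green"

print(kpi_status("heart_rate_bpm", 200))  # red
print(kpi_status("motor_temp_f", 200))    # green
```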

  • Security, Privacy and GDPR cause Organizations to Stumble

Unrelenting cyberattacks keep organizations in a defensive posture rather than moving forward with important digital initiatives and deployments. While we won't cover the myriad security steps every organization must follow in order to stay ahead of individual and state-sponsored hackers, this is one of the most important functions of an ICT department. Organizational leaders who don't take this seriously, by not funding the appropriate security technology or staffing the appropriate security headcount, do so at their own peril. Needless to say, organizations must prioritize the privacy and protection of data, people (employees and customers), and systems if they want to remain viable. To turn up the heat a bit, the European Union's General Data Protection Regulation (GDPR) becomes enforceable on May 25, 2018. This regulation gives EU citizens and residents control over their personal data by strengthening data protections for all individuals within the European Union and regulating the export of personal data outside the EU. Quite a few companies operating in countries across the globe play it fast and loose with the security and privacy of individual data without user consent. This comes to an end in May, when companies can be fined up to €20 million or 4% of their global annual revenue, whichever is greater, for violating the regulation. Any company operating in the EU must obtain explicit consent for all data collected from an individual, as well as state the reason/purpose for using and processing that data. Additionally, that user consent may be withdrawn. Many companies around the world haven't made the necessary changes to their digital systems to be compliant with GDPR and will be in for a rude awakening in 2018. Data privacy and security matter in a big way.

  • Making Sense of an Avalanche of Data with Advanced Analytics

While data and analytics systems have been around for decades, the amount of data collected for analysis by organizations has increased exponentially. With a 50x growth rate from machines alone, the Internet of Things has become the newest data source for organizations to analyze. Lots of little data integrated from people, machines and business systems adds up to an overwhelming amount of Big Data to make sense of. Luckily, there is an increasing number of streaming and batch analytics systems and tools to tackle this job. Making this trend better is that most of these technologies are open source and free, which helps level the playing field between small, mid-sized and large organizations with varying amounts of money to spend. Head over to Apache.org. Another interesting trend in data science is how Python has surpassed R as the most popular language for Machine Learning. An increase in online courseware, an abundance of scientific libraries, and the fact that Python is one of the easiest programming languages to learn mean you don't always need a PhD in Statistics to get the job done. Virtually every organization in the world is looking for Machine Learning/Deep Learning expertise, so this trend should help the supply side of the equation. The last analytics trend coming on strong in 2018 has to do with where data is analyzed. It will no longer be the exclusive domain of the cloud or large clusters of servers. The need to answer questions and make decisions more quickly is driving analytics of all types out to the Edge. Thanks to Moore's Law and the need to eliminate latency, more and more edge gateway devices will be performing IFTTT-style rules and even Machine Learning predictions (with models trained in the cloud). There's no shortage of important trends that are simplifying advanced analytics for organizations in 2018.
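
As a minimal sketch of analytics at the Edge, here's a gateway applying an IFTTT-style rule immediately and then running a prediction with a model that was trained in the cloud. It assumes a scikit-learn model shipped to the gateway as a joblib file; the file name, thresholds and feature layout are hypothetical.

```python
# Minimal sketch of edge analytics: an IFTTT-style rule fires immediately, and
# a model trained in the cloud (assumed to be shipped to the gateway as a
# joblib file) predicts failures locally. File name and features are hypothetical.
from joblib import load

model = load("failure_model.joblib")   # trained in the cloud, deployed to the edge

def on_reading(temp_c: float, vibration: float, amps: float):
    # IFTTT-style rule: if temperature exceeds the limit, then alert right now.
    if temp_c > 95.0:
        print("ALERT: over-temperature, no cloud round trip needed")

    # ML inference at the edge using the cloud-trained model.
    failure_risk = model.predict([[temp_c, vibration, amps]])[0]
    if failure_risk == 1:
        print("Predicted failure: schedule maintenance")

on_reading(97.2, 5.1, 12.4)
```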

Clearly, 2018 is going to be a transformational year where properly-equipped decision-makers and leaders can shift their organization into the next gear to accelerate their digital transformation. Hold on tight.

ROBTIFFANY.COM Named in the Top 100 Websites for IoT Industry Professionals

Thrilled to see robtiffany.com included in this distinguished group of the world’s top Internet of Things companies, news sites and individual #IoT influencers and luminaries like Rob van Kranenburg, Peggy Smedley, Stacey Higginbotham and Scott Amyx.

Years of innovating in the IoT, M2M, cloud and mobile industries combined with sharing my knowledge and opinions on https://robtiffany.com has been really rewarding for me.

Check it out at: http://blog.feedspot.com/iot_blogs/

The Cloud is Dead, Long Live the Edge

We interrupt your regularly scheduled migration to the #cloud to bring you a much more important megatrend called the Internet of Things. #IoT

The Internet of Things demands a low-latency, distributed, peer-to-peer environment that can only be found in the fog layer via edge computing.

It’s Time to Dump your 1990s App Authentication

Migrate Win32 applications secured by client/server #database logins to #mobile apps that use OAuth & enterprise #cloud directories for authentication instead.

Do you know Scott Tiger? Are you familiar with SA and no password? If so, you probably worked with client/server database security mechanisms from companies like Oracle, Microsoft, IBM and others. Anyone who’s built client/server, multi-tier database systems over the years has worked with Oracle Net Listener, TNSNames, Sybase DBLIB, ISAM and VSAM drivers plus a revolving door of Microsoft drivers. App logins were typically the same as the database login. DBAs were in control and app developers worked with what they were given. Sometimes data access was secured through the use of views or stored procedures. Things improved when databases started supporting integrated authentication where data access could be controlled by users and groups found in the company Active Directory.

Today’s mobile apps don’t connect to client/server databases this way. Win32 apps connecting via the LAN or VPN can kick the can down the road a bit longer. Everything else talks to databases with web APIs or sync. While these mobile-friendly APIs use database authentication to connect, the services they expose must be secured by an enterprise directory. This pattern provides identity management to mobile apps. Furthermore, cloud-based enterprise directories must be kept in sync with existing on-premises directories to keep the login procedures seamless for employees. Add multi-factor authentication to boost security and avoid consumer auth providers like Facebook or Twitter.
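
Here's a minimal sketch of that pattern: the web API validates an OAuth bearer token issued by the enterprise directory before it ever touches the database with its own service credentials. It assumes the Flask and PyJWT libraries, and the signing key, audience and claim names are placeholders.

```python
# Minimal sketch: the web API validates an OAuth bearer token from the
# enterprise directory, then queries the database with its own service
# credentials. Assumes Flask and PyJWT; key, audience and claims are placeholders.
import jwt                      # PyJWT
from flask import Flask, request, jsonify, abort

app = Flask(__name__)
DIRECTORY_PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----\n...\n-----END PUBLIC KEY-----"

def require_user(req) -> dict:
    auth = req.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        abort(401)
    try:
        # Verify the token came from the enterprise directory, not the database.
        return jwt.decode(auth[len("Bearer "):], DIRECTORY_PUBLIC_KEY,
                          algorithms=["RS256"], audience="orders-api")
    except jwt.PyJWTError:
        abort(401)

@app.route("/api/v1/orders")
def list_orders():
    user = require_user(request)
    # Only now does the service query the database using its own connection,
    # never the end user's database login.
    return jsonify({"user": user.get("preferred_username"), "orders": []})
```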

Reduce risk to your organization by decoupling app security from database authentication and make the move to company-wide directory services. Has your employer switched all its enterprise apps to modern authentication methods yet?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Reduce Business Risk by Deploying EMM Solutions with Conditional Access Capabilities

#EMM solutions that deliver conditional access to desired services like email, storage and #cloud services motivate #BYOD #mobile users to enroll.

Let's face it, your BYOD employees aren't too thrilled about installing an EMM app, agent or container on their devices. It feels like an intrusion into one of their most personal possessions and breeds mistrust. That said, the BYOD world is all about gives and gets. Unless your company enforces a corporate-liable policy and buys every employee a smartphone, a compromise must be made to ensure the security of corporate data. This is where the carrot comes into play.

While the BYOD trend was initially about allowing employees to use their mobile devices for work, the trend has shifted. Now you encourage your employees to use their devices because it makes them more productive anywhere, anytime. Whether your company is just allowing or actively encouraging employees to use their devices for work, you have to overcome the "hassle factor" and suspicions of company spying that deter them from EMM enrollment.

First, your Mobile COE must perform exhaustive due diligence to select the most unobtrusive EMM package available with the fewest steps to install that still meets your company’s needs. Next, this system must prohibit access to the systems, apps and data employees want most until they enroll. Some packages even limit access via MAM functionality. Anyway, if you want email, you have to enroll. If you want to access SharePoint, you have to enroll. You get the idea. Gives and gets.
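
Here's a minimal sketch of that conditional access check: access to email or SharePoint is only granted when the device shows up as enrolled and compliant in the EMM system. The enrollment lookup is a hypothetical stand-in for a real EMM/MDM API.

```python
# Minimal sketch of the conditional access "carrot": grant access to email or
# SharePoint only when the device is enrolled (and compliant) in the EMM
# system. The enrollment registry is a hypothetical stand-in for a real EMM API.
ENROLLED_DEVICES = {"device-abc123": {"compliant": True}}   # placeholder registry

def conditional_access(device_id: str, resource: str) -> bool:
    device = ENROLLED_DEVICES.get(device_id)
    if device is None:
        print(f"{resource}: denied, device not enrolled, prompt user to enroll")
        return False
    if not device["compliant"]:
        print(f"{resource}: denied, device out of compliance")
        return False
    return True

conditional_access("device-abc123", "email")        # granted
conditional_access("device-unknown", "SharePoint")  # denied until enrollment
```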

Reduce risk to your business by restricting corporate system access to only those devices enrolled in an EMM solution. What is your company doing to prevent unmanaged devices from accessing sensitive data?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Reduce Business Risk by Using Employee Smartphones and Multi-factor Authentication to Secure Corporate Resources

The perception that employee #smartphones are a #security liability is misplaced. They're a #mobile, multi-factor authentication security asset.

It's clear the things we've done in the past to stay secure are no longer sufficient. The pervasive use of usernames and passwords to authenticate with every kind of system on the planet is breaking down. Passwords aren't strong enough and no one can remember them all. Some companies require something called two-factor authentication in order to access their computer systems. This dramatically increases security because you're required to have something like a smartcard and know something like a PIN in order to gain access. The downside is that everyone has to have a smartcard with cryptographic information on an embedded chip, as well as a smartcard reader plugged into a PC, to make this work. How likely is it that everyone on a global scale has this kind of gear? Not very.

It makes you wonder: is there some kind of device carried by almost every human on the planet that could substitute for a smartcard? Seek out cloud and on-premises systems that work with devices to implement modern security features like multi-factor authentication. Now when an employee enters their corporate credentials, the system calls their phone and requires them to dial in an additional PIN to prove it's actually them trying to access corporate resources. A bad actor who may have stolen your credentials won't have your phone to answer the call or know your PIN. It's also unlikely they'll have your face or fingerprint if you've enabled biometric security.
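
Here's a minimal sketch of the "something you know plus something you have" idea. I described a phone-call-and-PIN flow above; in this sketch a time-based one-time password (TOTP) generated on the employee's smartphone stands in as the second factor. It assumes the pyotp library, and the password check is obviously a placeholder.

```python
# Minimal sketch of two factors: a password (something you know) plus a TOTP
# code generated on the employee's smartphone (something you have). Assumes
# the pyotp library; the password check and secret are placeholders.
import pyotp

def verify_password(username: str, password: str) -> bool:
    return password == "correct horse battery staple"   # placeholder first factor

totp = pyotp.TOTP(pyotp.random_base32())   # secret would be enrolled on the phone

def login(username: str, password: str, otp_code: str) -> bool:
    # Factor 1: something you know.
    if not verify_password(username, password):
        return False
    # Factor 2: something you have (the phone generating the code).
    return totp.verify(otp_code)

print(login("alice", "correct horse battery staple", totp.now()))  # True
```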

Reduce risk to your business by having employees use their smartphones to prove their identity when attempting access to corporate resources. What is your company doing to secure its business-critical resources?

Learn how to digitally transform your company in my newest book, “Mobile Strategies for Business: 50 Actionable Insights to Digitally Transform your Business.”

Click here to purchase a copy of my book today and start transforming your business!

Rob Tiffany Named Among Top 30 Global Technology Influencers in Major Report

I’m thrilled to be included in this group of #technology #influencers and luminaries like Werner Vogels, Steve Wozniak and Mark Russinovich.

Becoming one of those technology influencers has taken years of hands-on experience building mobile, cloud and Internet of Things solutions, combined with writing books, speaking at conferences around the world, blogging, tweeting and mentoring.

Check it out at: https://apollotarget.com/the-top-15-industry-analysts-usa/