Supporting Sustainable Development Goals Is Easier Than You Might Think

Companies and investors are being asked to support the 17 Sustainable Development Goals (SDGs) for 2030 — what some have described as “the closest thing the Earth has to a strategy” — since the public sector alone does not have the resources to do so. At the same time, companies must create value for their shareholders to generate the returns their ultimate beneficiaries need. In essence, both are being asked to do good and do well at the same time. This raises a number of obvious challenges:

  1. Unlike with financial performance, there are no universal standards for how to measure a company’s environmental, social, and governance (ESG) performance.
  2. As a result, there is a large ecosystem of nongovernmental organizations (NGOs) and data vendors attempting to solve this problem—so many that both companies and investors struggle with which ones to use.
  3. Companies remain skeptical about whether their shareholders will reward them for ESG performance over the long term.
  4. The 17 SDGs, which have 169 “business indicators,” are about improving the planet, whereas ESG metrics are about a company’s performance. What is missing is a way to show how these are related to each other.
  5. Investors remain frustrated with companies that do a poor job of explaining how their ESG performance contributes to financial performance.
  6. Without strong support from the investment community, the corporate community cannot make the contributions necessary to achieve the 2030 goals.

These challenges are surprisingly manageable. The key lies in leveraging the work of the Sustainability Accounting Standards Board (SASB) in the context of the SDGs.


The concept of “materiality” is central to linking ESG outcomes to their impact on SDGs. In financial reporting, material issues are those that are important to investors. Increasingly, these material issues include ESG issues. SASB has identified the material ESG issues in 10 sectors (subdivided into 79 industries) and, through its “Provisional Standards,” has recommended key performance indicators (KPIs) for reporting on them.

While SASB’s industry-level KPIs represent a company’s ESG outcomes, these outcomes also have an impact on organizations and people outside the company, which to varying extents contribute to SDGs. Thus, a relationship between ESG outcomes and SDG impacts exists via the concept of materiality.

The central idea is that we can advance the achievement of SDGs by improving ESG outcomes through this three-step process:

  1. Understand which ESG outcomes are material for a company.
  2. Determine how performance on these outcomes contributes to one or more SDGs.
  3. Track improvements in performance on these ESG outcomes that impact the SDGs.

For example, job creation by a company is an ESG outcome, and the SDG impact would include greater literacy (SDG No. 4 on quality education), since more children can finish school instead of working to support their families. A company’s carbon emissions are relevant to SDG No. 13 on climate action. In what follows, we analyze the relationship between material ESG issues and the SDGs, but we first examine the relationship between all ESG issues and the SDGs.

Material ESG Issues and the SDGs

In order to understand the relationship between ESG outcomes and SDG impacts, we first did a high-level mapping of SASB’s 30 generic ESG issues to the SDGs, using a model developed by Himani Phadke and Lauren DeMates of TruValue Labs. We then mapped the material issues identified for each of the 79 industries to each of the 16 SDGs (excluding SDG No. 17). This is the specific guidance corporate leaders will need to understand how they can create value for shareholders while contributing to the SDGs. For each industry and SDG, we computed an Industry SDG Impact Index (ISII): the number of that industry’s material ESG issues relevant to the SDG, divided by the number of all material issues relevant to that SDG, times 100. We then calculated a Sector SDG Impact Index (SSII) for each SDG by averaging the ISIIs of the industries in that sector, and an aggregate score for each sector (ASSII) by averaging its SSIIs across all 16 SDGs. In essence, the SSII measures the extent to which a company doing well on the material issues for its sector is doing good by contributing to each SDG.
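The index construction described above can be sketched in a few lines of Python. The issue counts below are hypothetical illustrations, not SASB data; only the ratio-then-average mechanics follow the text.

```python
# Sketch of the Industry SDG Impact Index (ISII) and its sector-level
# averages. All counts here are invented for illustration, not SASB data.

def isii(industry_issues_for_sdg: int, all_issues_for_sdg: int) -> float:
    """ISII = (industry-specific material issues relevant to the SDG /
    all material issues relevant to that SDG) * 100."""
    return 100.0 * industry_issues_for_sdg / all_issues_for_sdg

# Hypothetical: two industries in one sector, scored against two SDGs.
# {sdg: (industry's relevant material issues, all relevant material issues)}
industries = {
    "Industry A": {13: (3, 20), 4: (1, 15)},
    "Industry B": {13: (5, 20), 4: (2, 15)},
}

# Sector SDG Impact Index (SSII): average ISII across the sector's
# industries, computed per SDG.
sdgs = [13, 4]
ssii = {
    sdg: sum(isii(*ind[sdg]) for ind in industries.values()) / len(industries)
    for sdg in sdgs
}

# Aggregate sector score (ASSII): the sector's SSIIs averaged over the
# SDGs scored.
assii = sum(ssii.values()) / len(ssii)
print(ssii, assii)
```

With these made-up counts, the sector scores 20 on SDG 13 and 10 on SDG 4, for an aggregate of 15 — a toy version of the comparison behind the sector rankings discussed below.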

These calculations showed that for each sector, there are particular SDGs where it has high impact; for each SDG, there are particular sectors that have a high impact on it; and some sectors are more important to the SDGs in aggregate than others. As an example of the first point, the Consumption sector has a particularly large impact on SDG No. 2 (end hunger), No. 4 (inclusive and equitable quality education), No. 13 (combat climate change), and No. 15 (sustainable ecosystems). “The Three Most Important SDGs for Each Sector” shows, for each sector, the three SDGs on which it will have the most impact. As an example of the second point, Consumption and Resource Transformation are particularly important to SDG No. 13. “The Three Most Important Sectors for Each SDG” shows which sectors have the most impact on each SDG.

None of these findings is surprising: one would expect some sectors to be more relevant to certain SDGs than others and, conversely, certain SDGs to depend more heavily on particular sectors. What is more surprising is that a few sectors really stand out in terms of their impact on the SDGs, and that some ESG KPIs have a larger impact on certain SDGs than others. The former means that the success of a few sectors will largely determine whether the SDG goals are met. The latter means that while some SDGs will substantially benefit from the private sector “doing well,” others will benefit to a lesser extent.

As shown in “Top-Ranking Sectors for SDGs,” the sectors that are particularly important to the SDGs are health care and consumption, followed by resource transformation and nonrenewable resources. Transportation, services, and financials are less important, although in the case of the latter, our methodology does not capture their role in financing companies and projects that support the SDGs.

We hope this analysis is helpful for two broad audiences. The first is the corporate community, which can use SASB’s ESG key performance indicators to determine how to do well and do good at the same time. The second is investors and NGOs focused on a particular SDG or group of SDGs. This analysis will enable both audiences to identify which sectors are most important for investment and collaboration through Public-Private Partnerships, the focus of SDG No. 17.

MIT Sloan Management Review

With a single focus, Intel’s Vaunt has more potential than Google Glass

Back in October of 2013, I got my own pair of Google Glass in order to cover the technology. The site where I worked at the time paid the $1,500 cost, and I later spent my own $225 to add custom frames that could handle my eyeglass prescription. Given the fate of Glass, we clearly didn’t get a good return on those investments.

Still, there were some things to like about the experience. Glass brought contextual information “closer” to me in a relatively non-intrusive way. And that’s exactly what Intel’s smart glasses prototype, known as Vaunt, can do.

When I first read about Vaunt over at The Verge earlier this week, I thought less about the hardware and more about that vision of context and personally important data. That’s because our major advances in mobile computing have all pushed along this theme.

I look at it this way:

  • In the desktop age, the web brought us closer to data on other computers.
  • Connected laptops brought us closer to data when away from the desktop.
  • Phones put that data in our hand and pocket almost wherever we were.
  • Smartwatches let us wear that data, bringing it even closer.
  • Smart glasses can beam that data — at least in the case of Vaunt — directly on our retinas.

Every step of that progression gets us physically closer to contextual information. I suppose the next, or maybe final, step is a Matrix-like jack that simply ports that data directly into our brains, but who knows? Regardless, this is an important theme as more devices around us create gobs of data. The fewer barriers there are between us and the information we want, the faster we can use or act upon it.

And that’s why I’m excited about Vaunt’s potential, perhaps more so than I was about that of Google Glass.

To contrast the two at a high level, Vaunt isn’t trying to take smartphone functions — such as taking photos and videos, a key reason Glass never had a chance of mainstream success — and move them to your eyes. Instead, the product is singularly focused on very specific information that you will want at a specific time and/or place.

That approach has benefits from a hardware perspective too. That’s why you essentially can’t tell the difference between Vaunt and a traditional pair of glasses. They appear to be standard eyeglass frames to both you and the people around you.

Without the need to include a camera sensor, microphone or speaker, the small chips and display components fit inside the frames. Eliminating the camera also allows for a smaller battery since powering an image sensor typically uses a lot of energy. Using a low-powered, single color laser for the retina projection helps with battery life too when compared to the color display used in Google Glass.

By distilling potential product features down to essentially one — simple but very useful information — Vaunt actually solves a problem. Glass sort of did too, but its extra features came along for the distracting ride. In fact, I don’t see much of a distraction factor with Vaunt: the glasses don’t look like a technological device, and people won’t even realize that your retina is receiving information.

Clearly, this doesn’t mean Vaunt will be successful. In fact, Intel isn’t even sure how Vaunt will be used. That’s why the company will be launching an early access program for developers at some point this year. Intel is just providing the technology; developers will provide the functions that they think people will want.

Think of Vaunt then as a new hardware platform with a very limited feature set. That feature is very powerful though: It takes us one step even closer to the information that personally matters most to us.

Stacey on IoT | Internet of Things news and analysis

Five things that are bigger than the Internet: Findings from this year’s Global Cloud Index

The scale of the Internet is awe-inspiring. By 2021, there will be 4.6 billion people and 27 billion devices connected to the Internet, and Internet traffic will reach 2.8 trillion.
IoT – Cisco Blog

Your Data Is Worth More Than You Think

Data has become a key input for driving growth, enabling businesses to differentiate themselves and maintain a competitive edge. Given the growing importance of data to companies, should managers measure its value? Is it even possible for a company to effectively measure the value of its data? An increasing number of institutions, academics, and business leaders have begun tackling these questions, leaving managers with many alternatives for assessing the value of data. None are yet generally accepted, nor completely satisfactory, but they can help organizations realize more value from their data.

Why Is Data Valuation Important?

There are three basic reasons organizations want a good way to understand the value of their data. A good sense of value can help guide good decisions around direct monetization, internal investments, and mergers and acquisitions.

Direct Data Monetization

Many organizations are keen to monetize data directly by selling it to third parties or marketing data products. An inability to understand data’s value can result in mispriced products. Understanding how exposing data to third parties affects its value for indirect monetization can also help guide the decision on whether to pursue direct monetization. Today, despite increasing recognition of the potential benefit, most organizations are very conservative about what data they expose outside the enterprise. Good valuation approaches could help leaders understand whether selling their data would really affect their competitive position or their ability to realize benefit from it themselves.

Internal Investment

Understanding the value of both current and potential data can help prioritize and direct your investments in data and systems. In our experience, most organizations struggle to articulate the relationship between their IT investments and business value generally. For data systems, the problem is particularly acute. Surveys report that only about 30% to 50% of data warehousing projects are successful at delivering value. Understanding how data drives business value can help you understand where you should be minimizing costs, and where you should be investing to realize potential ROI.

An ability to articulate data’s contribution to an organization’s overall value can transform the relationship between technology and business management. C-level executives (CXOs) charged with managing data report that their ability to articulate the business value of data investments, with rigor supported by the CFO, results in more resources being available to drive positive outcomes for their organizations.

Mergers & Acquisitions

Inaccurately valuing data assets can be costly to shareholders during mergers and acquisitions (M&A). Steve Todd, an EMC fellow, argues that data valuations can be used both to negotiate better terms in initial public offerings, M&A, and bankruptcy, and to improve transparency and communication with shareholders. Did Microsoft Corp.’s purchase price for LinkedIn Corp. include the value of LinkedIn’s data about professionals and companies? Did it account for potential uses of that data in the combined company? The assumption that data’s value is captured only by sales and revenue figures may understate the overall value of a transaction, to the benefit of the buyer and the detriment of the seller.

Current generally accepted accounting principles (GAAP) do not permit data to be capitalized on the balance sheet. This leads to considerable disparity between the book and market values of data-rich companies, and a possible mispricing of valuation premiums. While internationally agreed-upon standards may emerge in the next five years, the Association of Chartered Certified Accountants (ACCA), the global professional accounting organization, is encouraging accounting firms to come forward with approaches. Wilson and Stenson provide an excellent review of accounting approaches that recognize and value intangible assets in general, and information assets in particular.

Existing Approaches Are Useful, But Limited

Methods for valuing data are varied. Most descend from existing asset valuation or information theory. Some attempt to attribute the value of business outcomes directly to data-driven capabilities. Like statistical models, all have limitations, but some are useful.

Dell EMC Global Services Chief Technology Officer Bill Schmarzo developed the so-called “prudent value” approach, which values data sets based on the extent to which they could be used to advance key business initiatives that support an organization’s overall business strategy. This approach has two main advantages:

  • It provides ballpark valuation (or a range of values) for the data set derived from the financial value of the business initiative.
  • More important, it frames the data valuation process around the business decisions that need to be made to drive the targeted business initiative. It quantifies the ways in which different data sets might be utilized and the impact this could have on the success of the targeted business initiative.
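One way to read the prudent value approach is as a simple attribution exercise: start from the financial value of a targeted business initiative and apportion shares of it to the data sets that support it. A minimal sketch, with invented figures and contribution weights (they would come from business judgment, not measurement):

```python
# Rough "prudent value"-style sketch: apportion the financial value of a
# business initiative across the data sets that support it.
# All figures below are hypothetical.

initiative_value = 10_000_000  # estimated financial value of the initiative ($)

# Estimated relative contribution of each data set to the initiative's
# key decisions (judgment calls, summing to 1.0).
contribution = {
    "customer transactions": 0.5,
    "web clickstream": 0.3,
    "third-party demographics": 0.2,
}

# Ballpark value of each data set, derived from the initiative's value.
data_set_value = {
    name: initiative_value * weight for name, weight in contribution.items()
}

for name, value in sorted(data_set_value.items(), key=lambda kv: -kv[1]):
    print(f"{name}: ${value:,.0f}")
```

The point of the exercise is less the dollar figures than the forced conversation about which decisions the initiative depends on and which data sets actually inform them.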

Mapping data to valuable outcomes can fulfill many of the purposes of data valuation. It supports rigorous ROI arguments for IT investment decisions, grounded in concrete business outcomes. It can also guide pricing in direct monetization efforts: the business value of the decisions that third parties make using the data is a guide to the price they might pay for access.

Some of the most comprehensive work on the subject of data valuation comes from Gartner Inc.’s Douglas Laney. Laney, vice president and distinguished analyst, Chief Data Officer Research, proposes “infonomics” as an economic discipline, arguing that information should be treated as an actual corporate asset — measured, managed, and deployed as if it were a traditional asset. Laney describes six different information valuation methods, three foundational and three financial.

The foundational methods are primarily aimed at businesses that wish to prioritize or aggregate data quality characteristics to get a sense of their data’s relative or intrinsic value. These methods force businesses to take stock of their data and how they are leveraging it (or not!), and ultimately to articulate its value and evaluate what is and isn’t useful. Laney’s financial measures draw on methods for valuing intangible assets.

The biggest limitation of Laney’s approach is that it does not tie the value of information to its role in supporting business decisions. His approach is more likely to be useful for valuing data in M&A transactions.

Where to Start

While there is still room for significant improvement in how to value data, current methods can still be useful to enterprises. Organizations should begin efforts to:

  • Create management consensus on how to build business cases for IT investments in data, infrastructure, and capabilities.
  • Use data valuation to prioritize data investments.
  • Begin cataloging and estimating value from existing and potential data-driven capabilities to inform valuation on the public markets or in M&A transactions.

Organizations that become more capable of getting value from data will certainly realize benefits and competitive advantage. Developing the ability to understand data’s value, and contribution to outcomes, is an important part of delivering that value.

MIT Sloan Management Review

How AT&T migrated more than 40,000 users to IBM’s IoT connected products

For any telecom provider, avoiding system downtime is usually job one. So when AT&T decided to migrate more than 40,000 users to IBM’s IoT-enabled connected products — with the goal of supporting internal software development and replacing its existing disparate solutions — the company knew it was in for a long slog.

Over the course of about three years, the company used an agile business planning model to pull off this monumental project. AT&T agile tool product owner and team lead Tiina Seppalainen detailed how it all unfolded at the recent IBM Continuous Engineering Summit in New Orleans.


AT&T’s Tiina Seppalainen describes her company’s Rational migration at the 2017 CE Summit.

Connected products buy-in and planning

Seppalainen said the keys to success lay in broad buy-in from all levels of the company and a meticulously planned process for executing and managing the changes at various stages. “We had an aggressive and changing schedule, and we did this without any kind of formal training,” she said. “We were given the tools and told to have at it, so that’s what we did.”

AT&T had to account for about 3,000 applications that support different parts of the company. “We ended up [affecting] about 100 project areas and 57 servers. And they all needed to be built in time for the migrations to the Rational tools,” Seppalainen said. “It was all planned very carefully and sequentially because of the dependencies between our servers.”

This included setting up between five and 10 scrum teams with a total of about 100 people who were primarily dedicated to the project. “This allowed us to be nimble and change as needed, which was frequently,” Seppalainen said. “We regularly adjusted either what we were doing or when we were doing it, and we had a team to engage user groups and keep people apprised of the progress.”

She added that a critical component of the endeavor was strong leadership commitment at multiple levels and very active support. This included internal communications, webcasts and town halls. “It was very clear from the top down that this was going to happen, and quickly, and everybody needed to support it,” Seppalainen said.

In keeping with the agile model, the migration teams rolled out the new solutions in phases, gathered feedback from early adopters, and adjusted accordingly. Naturally, such a large organization has many stakeholder groups that wanted different things. Seppalainen said they couldn’t fulfill everyone’s wish list, but they were able to prioritize the most critical requests.

Coordinating the calendar

The volume of users necessitated scheduled releases and automations, which the agile team spread out across the calendar so as not to overwhelm everyone with the build-outs and iterations. (Seppalainen said the crucial help IBM provided in the project area design work was “a key to our success.”) “We had a lot of concurrent and dependent activities and multiple work streams, and they all had to be managed,” she said. “It was a major challenge to coordinate interdependent and overlapping work efforts.”

This meant having frequent meetings, almost all of them virtual, including check-ins at the beginning and end of every work day. “It was a very agile mindset,” Seppalainen said. “That’s what made this happen; just doing that day in day out.”

Testing and training

Once new roll-outs were in place, AT&T conducted extensive testing, established training courses that included certifications, and created “small-bite” videos so users could quickly refresh their skills as needed.

Now that the heavy lifting has concluded, Seppalainen said her company views this massive endeavor as a success. “Using the agile approach was one of the key factors because it’s team oriented, very iterative and gives you the ability to adjust to changing needs,” she said. “Our strong program and project management is what brought this three-year odyssey to fruition.”

To see how you can streamline your organization’s operations and improve productivity with IBM’s Continuous Engineering and IoT solutions, visit our landing page. And join us at Think 2018, March 19-22 in Las Vegas.


Internet of Things blog