The Dawn of the AI-Powered Web

by New Relic
February 8, 2016

Welcome to This Week in Modern Software, or TWiMS, our weekly roundup of the need-to-know news, stories, and events of interest surrounding software analytics, cloud computing, application monitoring, development methodologies, programming languages, and the myriad of other issues that influence modern software.

This week, our top story concerns Google’s incoming head of search and the future of an AI-powered Web.

TWiMS Top Story:
AI Is Transforming Google Search. The Rest of the Web Is Next—Wired

What it’s about: Google’s parent company, Alphabet, announced its quarterly earnings this week, but the bigger long-term news might be that Google’s SVP for search, Amit Singhal, will retire at the end of February. Most personnel news might seem a bit of a yawn, but there’s a big story here in what his replacement might mean not just for Google but, as Wired’s Cade Metz writes in his story about the change, for the rest of the Web, too. That’s because Google quickly named its head of AI research, John Giannandrea, as Singhal’s replacement. “You can … view the pair as the ideal metaphor for a momentous shift in the way things work inside Google—and across the tech world as a whole,” Metz writes.

Why you should care: It’s a significant follow-on development to last October’s revelation, first reported by Bloomberg, that Google wasn’t just researching AI but was already using an AI-like system called RankBrain to interpret a “very large fraction” of queries in its flagship search engine. Google, like Microsoft, Facebook, and other tech titans, has been investing heavily in artificial intelligence, machine learning, and related areas, something that Singhal’s search team wasn’t especially thrilled about, according to Wired’s story.

With Giannandrea’s appointment, it’s apparent Google’s efforts aren’t destined for its infamous product graveyard. Instead, AI is remaking Google’s rainmaker, search, and likely many other major applications and services, too. That has pervasive impacts on how software gets maintained and updated. Wired’s Metz cites something Giannandrea said to a press gathering at Google HQ last year, well before he ascended to the search throne: “By building learning systems, we don’t have to write these rules anymore,” Giannandrea told reporters. “Increasingly, we’re discovering that if we can learn things rather than writing code, we can scale these things much better.” 
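Giannandrea’s point is easier to see with a toy contrast. Below is a small Python sketch (using scikit-learn) that expresses the same decision first as a hand-written rule and then as a model that learns it from labeled examples; the task, names, and data are purely our own illustration and have nothing to do with Google’s actual systems:

```python
# Toy contrast between hand-written rules and a learned model.
# Entirely illustrative; assumes scikit-learn is installed.
from sklearn.linear_model import LogisticRegression

# Rule-based approach: flag a query as "navigational" if it mentions a known site.
KNOWN_SITES = sorted({"facebook", "twitter", "youtube"})

def is_navigational_rule(query: str) -> bool:
    return any(site in query.lower() for site in KNOWN_SITES)

# Learned approach: fit the same decision from labeled examples instead of
# maintaining the rule (and the KNOWN_SITES list) by hand.
queries = ["facebook login", "youtube cat videos", "best pizza near me",
           "twitter trending", "how to tie a tie", "weather tomorrow"]
labels = [1, 1, 0, 1, 0, 0]  # 1 = navigational, 0 = informational

def features(query: str):
    # One feature per known-site mention; a real system would learn features too.
    return [1 if site in query.lower() else 0 for site in KNOWN_SITES]

model = LogisticRegression().fit([features(q) for q in queries], labels)

print(is_navigational_rule("facebook login"))           # rule-based answer
print(model.predict([features("facebook login")])[0])   # learned answer
```

The hand-written version scales only as fast as someone keeps updating the rules; the learned version scales with the training data, which is the trade-off Giannandrea is describing.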

Further reading:

Microsoft Buys SwiftKey, the Latest Deal in AI-Hungry Silicon Valley—The Wall Street Journal

What it’s about: Microsoft stayed busy this week, highlighted by its announcement that it will spend $250 million to acquire SwiftKey, the London-based maker of the super-popular predictive typing app for mobile devices. And Microsoft has every intention of keeping and growing the app, according to EVP for Technology and Research Harry Shum in his blog post about the deal, not to mention looking for deeper integrations in the Microsoft ecosystem. But here’s the real driver: “SwiftKey’s predictive technology aligns with Microsoft’s investments and ambition to develop intelligent systems that can work more on the user’s behalf and under their control,” Shum writes. Indeed, most observers chalked up this deal to Microsoft’s AI aspirations. It’s increasingly evident that AI will be a major priority in Redmond, too.

Why you should care: Hey, we told you AI would be a hot topic this year (see above item). And here’s more evidence that Microsoft is focusing on innovation: the company is putting cloud data centers 20,000 leagues under the sea, give or take. Last year, as part of an effort dubbed Project Natick, Microsoft sank a 10’ x 7’, 38,000-pound container to the Pacific Ocean floor, housing the compute power of roughly 300 desktop PCs. The purpose? To test the efficacy of full-blown underwater cloud computing data centers designed to be more energy efficient (by using the ocean for cooling) and to speed data transmission (by shortening the distance between the data center and users), among other potential benefits. Underwater data centers could also reduce the planning and construction time typically needed for land-based data centers—Microsoft built its Natick container in just 90 days, without the costs, permits, and other demands of a typical construction project. One day, new cloud data centers might just roll off a manufacturing line—and then off a boat and into the ocean.

Further reading:

The Rise of Data-Driven Decision Making Is Real but Uneven—Harvard Business Review

What it’s about: Think data and analytics are just the latest tech industry hype? New research—backed by reliable data, of course—from a team of scholars and U.S. Census Bureau representatives reveals plenty of substance supporting the rise of data, analytics, and the business world’s embrace of data-driven decision-making practices. After an extensive survey of the sector, the researchers found that data-driven decision making tripled from 2005 through 2010 in the U.S. manufacturing industry. The report’s authors expect adoption to continue its upward trend: Even with the rapid adoption curve, the majority of firms pegged as “likely adopters” of data-driven decision making still had yet to adopt it. But adoption is not uniform—rather, the researchers identified four key characteristics of data-driven businesses, which they shared in a recent blog post for Harvard Business Review.

Why you should care: While the research focuses on manufacturers, many of the factors fueling data-driven strategies would transfer well to other industries: sophisticated IT departments, highly educated staff (as measured by college degrees), company size (multi-plant manufacturers are more likely to adopt data-driven practices), and high awareness of the potential benefits of a data-driven approach. Clearly, strong technology, smart employees, and a well-informed understanding of the benefits and goals are all great foundations for unlocking the power of data.

The post’s authors, Kristina McElheran (an assistant professor at the University of Toronto) and Erik Brynjolfsson (a professor at MIT’s Sloan School of Management), note that their findings shouldn’t come as much of a shock. Virtually every business is an “information processor” these days (we’d have called it a “software business”). “Therefore, it’s only natural that technologies delivering faster, cheaper, more accurate information create opportunities to reinvent the managerial machinery,” they write.

Don’t Panic. Making Progress on the ‘Going Dark’ Debate—The Berkman Center for Internet & Society at Harvard University

What it’s about: A new report from The Berkman Center for Internet & Society at Harvard University appears to significantly change the course of the ongoing discussion on data encryption and “going dark”—in short, the idea that increasing use of end-to-end encryption, including by major tech companies like Google and Apple, will impede government and law enforcement agencies from preventing or prosecuting terrorist attacks and other criminal activities. The study’s conclusion: While the government’s concerns aren’t entirely unfounded, the fear of criminals truly “going dark” is overblown. Market forces, the modern software ecosystem, and other factors make it unlikely that we’re actually entering an age when the bad guys can roam cyberspace undetected. The report’s authors write: “Are we really headed to a future in which our ability to effectively surveil criminals and bad actors is impossible? We think not.”

Why you should care: The technology industry—including some of its largest companies—has been publicly at odds with the government over encryption and privacy matters. That debate will no doubt carry on. The report’s authors write that “we were not able to unanimously agree upon the scope of the problem or the policy solution that would strike the best balance,” but this report brings some useful perspective to the issue. Moreover, the findings are fascinating in their connections to the technical and business realities of modern software.

Businesses are unlikely to encrypt user data en masse, according to the report, because of how valuable that user data has become. The findings similarly suggest that fragmentation—typically bemoaned in the software world—also makes going dark unlikely: “Far more coordination and standardization than currently exists would be required.” The report also predicts that the Internet of Things could revamp the surveillance landscape, and that the metadata generated by today’s devices and systems—our phones, our email—is unlikely to ever be encrypted on a grand scale, ensuring enormous amounts of surveillance data will remain accessible.

Further reading: 

Clever New GitHub Tool Lets Coders Build Software Like Bridges—Wired

What it’s about: Sometimes, even the biggest and best applications and services in modern software need to be rebuilt. GitHub engineer Jesse Toth puts it this way in an interview with Wired: “As soon as you write code, it becomes legacy code. Somebody has to maintain it, and eventually you will need to change it.” And with that comes a whole host of potential problems, not the least of which is downtime. GitHub’s answer? A new, free tool it calls Scientist, which lets developers write and test new code in parallel with the functioning old code, switching over only when they’re fully ready to do so.

Why you should care: While Scientist works with Ruby out of the gate, Toth tells Wired that she envisions it eventually working with any code—not just with other modern software staples, but even a truly old language like Fortran. It’s a compelling tool for the long-standing problem of maintaining the health of your current products and services while making major changes to them behind the scenes—not to mention enabling effective testing and a seamless switchover when the new code is ready for production. The tool has been dog-fooded, too: GitHub built Scientist to help upgrade its own repository. Toth equates the concept to building the new San Francisco Bay Bridge in parallel to the old one: It’s smart engineering, and something the GitHub team felt was missing in their own project: “We were having trouble rewriting and replacing the code in a way we felt was safe,” Toth tells Wired. “With thousands of repositories, testing whether one small change or one thing that breaks it was really hard to do.”
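To make the parallel-run idea concrete, here’s a minimal Python sketch of the pattern Scientist implements. The names and structure are our own illustration, not the gem’s actual Ruby API, but the shape is the same: run the old code as the source of truth, run the new code alongside it, compare results, and log mismatches rather than surfacing them to users.

```python
# A bare-bones "experiment" helper in the spirit of GitHub's Scientist.
# Illustrative only; the real Scientist library is a Ruby gem with its own API.
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("experiment")

def run_experiment(name, use, try_candidate):
    """Run the old (control) and new (candidate) code paths side by side.

    The control's result is always returned, so user-facing behavior never
    changes; mismatches and candidate exceptions are only logged.
    """
    control_result = use()  # the existing, trusted code path
    try:
        candidate_result = try_candidate()  # the rewritten code path
        if candidate_result != control_result:
            logger.warning("%s: mismatch control=%r candidate=%r",
                           name, control_result, candidate_result)
    except Exception:
        logger.exception("%s: candidate raised an exception", name)
    return control_result

if __name__ == "__main__":
    # Toy example: an "old" and a "new" way to total an order, run in parallel.
    order = [("widget", 3, 2.50), ("gadget", 1, 9.99)]
    old_total = lambda: round(sum(qty * price for _, qty, price in order), 2)
    new_total = lambda: round(sum(item[1] * item[2] for item in order), 2)
    print(run_experiment("order-total", use=old_total, try_candidate=new_total))
```

Because the control’s result is always what gets returned, the candidate code can be wrong, slow, or even throw without affecting users, which is what makes it safe to leave running in production while a rewrite matures.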

Further reading:

We’re Living in the Golden Age of Software Development—InfoWorld

What it’s about: Technically, this story is from last week—but that really doesn’t matter, because Simon Bisson is writing about this entire modern era of software. In his view, we’ve got a lot to be happy about: We’re living and working in a golden age of software development, thanks to incredibly rich choices of tools, languages, architectures, and more. “We’re finally delivering on the decades-old promise of a ubiquitous computing world,” Bisson writes. “But more than that, from the developer’s standpoint, the tools available to us are better and more sophisticated than ever.” Hear, hear.

Why you should care: According to Bisson’s view of the software world, what’s to come will be even better, including a shift away from monolithic applications, enabled by microservices and a wealth of other technologies, from containers to Platform-as-a-Service tools to configuration-management systems, all moving us into a truly software-defined era. Bisson raves about Node.js, for example, which he calls “one of the most important technologies underpinning modern software,” and cites a slew of others. The bottom line: “There’s never been more choice for developers of all skill levels in languages, tools, services, and platforms,” Bisson writes. “If you want to build a modern app, pick a technology that seems right for your project—and start writing code.”

Want to suggest something that we should cover in the next edition of TWiMS? Email us at [email protected].

About the Author

Kevin Casey is a freelance technology writer and business writer for InformationWeek and other publications, with an increasing focus on IT careers and big data. Kevin won a 2014 Azbee Award from the American Society of Business Publication Editors for his feature story "Are You Too Old For IT?" and was a 2013 Community Choice honoree in the Small Business Influencer Awards. View posts by Kevin Casey.

 
