Seattle, cloudy with a chance of technology upgrades

June 14, 2016
The skyline of Seattle, Washington, U.S. is seen in a picture taken March 12, 2014.  REUTERS/Jason Redmond/File Photo

For Seattle, home to cloud technology powerhouses Amazon.com Inc (AMZN.O) and Microsoft Corp (MSFT.O), the process of upgrading the city’s data systems is moving at glacial dial-up speed rather than lightning-fast broadband.

In 2014, Seattle hired a chief technology officer to move its aging data infrastructure to the cloud, meaning using remote servers run by outside vendors.

With more dependable and higher-powered equipment, Seattle could avoid tech malfunctions and free up information technology employees for more important work, such as expanding free internet access.

But almost two years later, much of Seattle’s data is still stuck on older equipment because of lengthy government bidding processes and complicated rules around government data storage, as well as simple bureaucracy.

While Seattle has moved some storage and computing to a shared data center as part of a $40 million overhaul, much will not move until August. And most of its storage and computing will not be done using publicly shared servers such as those provided by Amazon and Microsoft, the heart of the public cloud.

Instead, Seattle will operate on what is called a private cloud, meaning dedicated systems it will not share, in part to meet privacy rules. Many technorati say the private cloud misses the advantages of the public cloud, including the ability to quickly tap into more capacity if needed.

Get it right, and local governments can tap the same technology that allows organizations ranging from video service Netflix to bank Capital One to process data faster, more efficiently and sometimes at lower cost.

Issues similar to Seattle’s, ranging from bureaucracy to labyrinthine regulations, have stalled officials in such technology hubs as Palo Alto and Santa Monica in California, and Austin, Texas, among other places. The situation raises the question: if even the city at the heart of one of technology’s biggest transformations is hitting speed bumps, what hope do others have?

“It’s political will and overcoming organizational history, where people have defined their careers by being the keepers and managers of systems,” Todd Sander, executive director for the Center for Digital Government, said in an interview.

Local governments spend $100 billion annually on IT, according to the Center for Digital Government, a California-based national research institute, but more than dollars is at stake.

“We have a responsibility for connecting people to their government,” Michael Mattmiller, Seattle’s chief technology officer, said. Taxpayers get more bang for their buck, he said, if upgrades mean IT staff spend less time on routine work such as troubleshooting servers, and more on improving digital services.

REGULATORY “RIGAMAROLE”

Seattle’s wake-up call came on a sweltering August night in 2012 when power failed in a downtown building housing key city servers. Backup generators kept the servers running, but workers had to pop out windows and bring in fans to keep equipment from overheating.

Now, equipment will sit in buildings managed by companies that specialize in running data centers.

Seattle, like all cities, also faces hurdles in complying with federal data-storage and confidentiality rules. For example, the U.S. Federal Bureau of Investigation requires a stringent certification process, including fingerprint background checks, before it will share criminal justice information services data, or CJIS, with local police. As a result, municipalities such as Seattle have decided to simply keep sensitive criminal data on local storage systems to limit the number of staff and data centers that must pass federal audits.

“It’s a big rigamarole and generally people don’t try to move CJIS data into the cloud because of the difficulty and cost, but it’s starting to happen,” said Steve Nichols, Georgia’s chief technology officer. Nichols has saved roughly $4 million since 2012 by running a handful of public-facing municipal websites on Amazon’s AWS public cloud.

Entrenched bureaucracy creates another hurdle.

“You are telling a department CIO to give up control of something that traditionally has been theirs,” said Carlos Ramos, who retired as California’s chief information officer in March.

Seattle manages 54 lines of business, Mattmiller said, from police services to an electric utility, with 650 professionals on a $122 million IT budget.

To shift to the cloud, Mattmiller must hold public hearings and take advice from his Community Technology Advisory Board, none of whose members has his level of technical expertise.

For municipalities that have overcome the challenges to move computing functions to the cloud, the savings have been substantial. Oakland County, Michigan, saves up to 50 percent annually on server costs, its chief information officer, Philip Bertolini, said.

(Reporting by Eric M. Johnson and Sarah McBride in Seattle; Editing by Ben Klayman and Richard Chang)