Mountain View-based business forecasting startup Aviso has raised $8 million

image

Mountain View-based business forecasting startup Aviso has raised an $8 million Series A round led by Shasta Ventures, First Round Capital, Cowboy Ventures, and Bloomberg Beta.

Aviso is building cloud-based applications that quantify risk and provide companies with more accurate earnings forecasts, applying advanced portfolio management principles and machine learning to revenue forecasting.

Founded in 2012, Aviso also emerged from stealth today.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International


Santa Clara-based networking company CloudGenix has raised $9 million

image

Santa Clara-based networking company CloudGenix has raised a $9 million Series A round from Charles River Ventures and Mayfield Fund.

CloudGenix’s software simplifies network operations by letting companies manage applications over multiple types of connections to improve performance and lower costs.

Founded last year, CloudGenix is emerging from stealth today and plans to launch a beta product in the coming months.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

New York-based social media analytics company Sprinklr has raised $40 million

image

New York-based social media analytics company Sprinklr has raised $40 million in a Series D round led by Iconiq Capital with participation from Battery Ventures and Intel Capital.

Sprinklr provides tools for brands to manage their social media activities across platforms like Facebook, Twitter, and LinkedIn, and currently serves around 450 enterprise customers.

Founded in 2009, Sprinklr has raised $77 million to date and will use the new funds to continue headcount and infrastructure growth leading up to a potential IPO next year.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Jose, California-based Apigee has raised $60 million in new funding

image

Apigee, which recently moved from Palo Alto to San Jose, California, has raised $60 million in new funding from Pine River Capital Management and Wellington Management, as well as current investors Norwest Venture Partners, Bay Partners, Third Point LLC, and SAP Ventures.

Apigee provides API technology and services for enterprises and developers, allowing companies to integrate different apps with existing software.

Founded in 2004, Apigee has raised $171 million to date and will use the new funds to prepare for a potential public offering in the next year.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Amazon Launches Wearables Store

image

Amazon (NASDAQ: AMZN) has launched a tech store dedicated to all things wearable. Shoppers can now get products such as smart watches, fitness and wellness trackers, wearable cameras and more at the new online store devoted to the category.

Like Amazon’s other dedicated stores, the wearable technology store will live within Amazon’s main site and provide a selection of brands such as Samsung, Jawbone and GoPro. To help educate consumers about wearables, Amazon has set up a learning center on the site that includes product videos and detailed buying guides. There’s also an “Editor’s Corner” with commentary by tech experts and bloggers, wearable technology industry news and device reviews.

Amazon’s move is the latest in the growing wearable retail market. A recent report by IDC forecasts wearable devices to reach 19 million units sold in 2014. Google (NASDAQ: GOOG) is pursuing the wearable market with its Google Glass computer eyeglasses and Android Wear product line. Android Wear is a version of its Android mobile operating system designed specifically for wearable devices, starting with watches.

Apple’s (NASDAQ:AAPL) entry into the wearable market is still speculative, although researchers predict the company could release its own version of a smart watch sometime next year.

For more:
-See this Amazon press release

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

eBay Pilots New “Click & Collect” Program

image

EBay (NASDAQ: EBAY) is creating a pilot click-and-collect program that allows U.S. shoppers to buy goods from eBay online and then select a physical retail location where their purchases can be delivered for pickup.

The online giant is currently running the click-and-collect model in the U.K. where it said the program is performing well.

“We’re seeing great consumer engagement with our Argos Click and Collect partnership in the U.K.,” said John Donahoe, president and CEO, in an earnings call Tuesday.

EBay launched its click-and-collect program in the U.K. in September. EBay customers there can buy from more than 100 sellers and collect their purchases in more than 100 Argos stores across the U.K.

As eBay continues to test new technology, the company is looking at more ways to strengthen mobile initiatives. EBay said it is working with select retailers to test beacon technology to give merchants a direct connection to consumers via their mobile devices when shoppers enter a store. EBay also said it will continue to add more in-store PayPal digital wallet partners, which will allow shoppers to order and pay ahead and skip the line in-store.

The new mobile programs are being launched for good reason. EBay released its quarterly earnings Tuesday and said mobile continued to be a major contributor to commerce volume. In the first quarter, the company generated $11 billion in mobile commerce volume, up 70 percent from a year earlier, and added 6.5 million new customers via mobile. EBay also highlighted its mobile partnerships with Samsung and Deutsche Telekom for driving innovation in mobile during the quarter.

Overall, eBay reported a first-quarter loss of $2.3 billion due to a tax charge on foreign earnings. That compares with net income of $677 million last year. Revenue increased 14 percent compared with the same period in 2013, to $4.26 billion.

PayPal was once again eBay’s strongest segment, growing net total payment volume by 27 percent and revenue by 19 percent year-over-year. Revenues on the marketplace side were up 10 percent.

For more:
-See this eBay earnings call transcript

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Cracked: The infamous Atari E.T. game mystery

image

It was certainly the stuff of urban legend, only this time it turned out to be quite a bit more. The story goes that Atari, facing huge losses and an even bigger hit to its reputation in 1984, buried what amounted to the gaming industry’s Edsel: the E.T. the Extra-Terrestrial game. The company supposedly dumped “millions” of E.T. cartridges in the desert. Years later, in an effort to prove the story true, Microsoft’s Xbox group set out to find and dig the cartridges up as part of a documentary it is producing with Fuel Entertainment and Lightbox called “Signal to Noise.” Over the weekend the quest proved wildly successful, turning up the Atari stash buried in a long-closed Alamogordo dump. Here’s a look at the event.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

The data center of 2025 revealed

image

A survey that challenged IT managers to imagine the data center of 2025 offers up some optimistic, even surprising, findings.

About 800 IT data center managers globally responded to the Emerson Network Power survey and three of its major findings foretell major changes ahead:

First, by 2025, data center managers expect nearly 25% of their power will come from solar energy, which today accounts for about 1% of a data center’s energy supply.

Second, in 10 years, nearly three-quarters of the respondents believe that at least 60% of computing will be cloud based.

And third, 58% of managers expect data centers will be smaller in 10 years: 30% predicted they’ll be one-half the size of today’s data center, 18%, one-fifth the size, and 10%, one-tenth the size.

The renewable energy finding may be the survey’s most striking. Today, solar, wind, fuel cells, geothermal and tidal energy sources account for no more than 10% of a data center’s power. But in 10 years, respondents expect these renewables to account for 50% of a data center’s power, with fuel cells accounting for about 11% of that 50% figure.

That confidence in renewable energy may be optimistic, and indicates that managers “are imagining some fairly large technical breakthroughs that are going to happen in the renewable space,” said Steve Hassell, president of data center solutions for Emerson Network Power. Either that, or they are as clueless as the rest of the population when it comes to believing the marketing hype of renewable-energy zealots.

Currently, a square meter of solar panel can generate 800 kWh per year. Supporting a power density of 6.4 kW, which is near the average for a rack, requires eight square meters of solar panels, and more when cooling is considered. But the use of solar is increasing. Apple, for instance, has a 100-acre solar array to help power its data center in Maiden, N.C.
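
As a rough sanity check on those numbers, take two assumptions that the survey write-up does not spell out: that the 6.4 kW rack figure is a continuous, around-the-clock draw, and that the 800 kWh-per-square-meter annual yield holds. A few lines of C make the arithmetic explicit:

    #include <stdio.h>

    int main(void) {
        /* Assumptions (not stated in the survey write-up): the rack draws
         * 6.4 kW around the clock, and a square meter of panel yields
         * 800 kWh over a year, as quoted above. */
        const double rack_kw = 6.4;
        const double panel_kwh_per_m2_year = 800.0;

        double rack_kwh_per_year = rack_kw * 24.0 * 365.0;  /* about 56,000 kWh */
        double panel_area_m2 = rack_kwh_per_year / panel_kwh_per_m2_year;

        printf("Annual energy per rack: %.0f kWh\n", rack_kwh_per_year);
        printf("Panel area required:    %.0f square meters (before cooling)\n",
               panel_area_m2);
        return 0;
    }

Under those assumptions a single average rack needs on the order of 70 square meters of panels, which suggests the eight-square-meter figure above understates the requirement unless a much higher panel yield or a much lower duty cycle is assumed.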

The current trend in rack densities is holding at 6 to 8 kW, said Hassell, although there are data centers with racks that use as much as 40 kW.

“We haven’t seen a dramatic increase in rack density,” said Hassell. But the survey respondents expect that to change, with 26% predicting power densities of 80 kW in 10 years and another 15% envisioning 100 kW densities.

These figures, if accurate, suggest the respondents are not factoring these power increases into their expectations for renewable energy capacity: if rack densities and power requirements rise as predicted, renewables’ share would fall back toward its current percentages without some totally unknown breakthroughs of which we are not yet aware.

The survey also picked up a shift in private power generation, especially in regions of the world where the grid is less reliable.

Apple’s private power generation efforts, illustrated by its solar farm, may become more common in the data center industry, at least among large “hyperscale” providers. Google’s decision to locate a data center near a hydroelectric dam in Washington State might fall under the private power generation definition as well, said Hassell.

Those surveyed also had a lot of confidence in a self-healing, full visibility, self-optimizing data center, which may arrive with the increasing adoption of software-defined data centers and data center infrastructure management.

The respondents were broken out by region, and among those from the U.S., half expect to still be in the data center business by 2025, with 37% saying they will be retired by that point. In the Asia Pacific region, only 10% of the respondents said they would be retired in 10 years.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

 Netflix cuts deal to pay Verizon for direct access

image

This is the way the free market works… It should not be messed with. Especially by any government entity IMO – JNR

Even as it continues to wage war with Comcast (NASDAQ: CMCSA)–including openly opposing the $45.2 billion merger between the nation’s top MSO and Time Warner Cable (NYSE: TWC)–Netflix (NASDAQ: NFLX) is paying to improve its working relationship with Verizon (NYSE: VZ).

According to multiple reports, Netflix has agreed to pay Verizon to guarantee faster access on its broadband networks, ensuring that Netflix and Verizon customers get a better streaming experience. Neither side would say how much this will cost, but a Netflix spokesperson confirmed the deal to techradar.

“We have reached an interconnect arrangement with Verizon that we hope will improve performance for our joint customers over the coming months,” the spokesperson said.

Which, of course, raises the question of why Netflix has been openly rancorous about paying for better access to ride on Comcast’s networks, as evidenced by a recent blog item written by Ken Florance, vice president of content delivery at Netflix, complaining that Comcast was forcing Netflix to pay for better access.

As part of the blog item, Florance produced a chart “which shows how Netflix performance deteriorated on the Comcast network and then immediately recovered after Netflix started paying Comcast (for direct interconnection) in February.”

Comcast isn’t the only broadband pipe with which Netflix takes issue. In a letter to shareholders accompanying second quarter earnings, Netflix CEO Reed Hastings and CFO David Wells said AT&T’s (NYSE: T) fiber-based U-verse service “has lower performance than many DSL ISPs such as Frontier, CenturyLink and Windstream” and suggested “it is free and easy for AT&T to interconnect directly with Netflix and quickly improve their customers’ experience, should AT&T so desire.”

For more:
– techradar has this story
– and Bloomberg has this story

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Target Names New CIO as It Tries to Clean Up Its Image After Massive Data Breach

image

Target (NYSE: TGT) is moving forward from its massive data breach with the addition of a new CIO. The company announced today that Bob DeRodes, who has advised the U.S. Department of Homeland Security, the Department of Justice and the Secretary of Defense, will become CIO on May 5.

Target’s previous CIO departed from the company in early March amid the retailer’s recovery from the cyber attack that compromised payment data of more than 70 million Target shoppers during the holidays. While the top information post has been filled, Target is still also actively searching for a chief information security officer and a chief compliance officer to round out its data security team.

DeRodes comes to Target with more than 40 years of experience and is a recognized leader in information technology, data security, and business operations. The security exec has held top technology positions at a number of multinational companies including CitiBank, USAA Federal Savings Bank, First Data, Home Depot and Delta Air Lines. 

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Redwood City-based business financial forecasting startup Tidemark has raised $32 million from a VC group including Andreessen Horowitz

image

Redwood City-based business financial forecasting startup Tidemark has raised $32 million in additional funding from Silicon Valley Bank and existing investors Greylock Partners, Andreessen Horowitz, Redpoint Ventures, and Tenaya Capital.

Tidemark delivers digital financial board books to companies, allowing them to view, share, and comment on financial information in the cloud as well as on mobile devices.

Founded in 2009 but launching its new product Playbooks in the coming months, Tidemark will use the new cash to fund product development and expand sales and marketing to new regions.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based cloud app monitoring company New Relic has raised $100 million

image

San Francisco-based cloud app monitoring company New Relic has raised $100 million in a new funding round led by BlackRock and Passport Capital with participation from T. Rowe Price Associates and Wellington Management.

New Relic provides an all-in-one server and application performance tool for developers to monitor and manage cloud applications.

Founded in 2008, New Relic has raised nearly $275 million to date and will use the new funds to accelerate global expansion and develop both new and existing products.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

SaaS player Alert Logic pitches up in Cardiff

image

US Security-as-a-Service specialist Alert Logic has pitched up in the UK, cutting the ribbon on a datacentre and operations centre in Cardiff as it moves to provide more support for its European customers.

Cardiff has been chosen as the base for a UK hub with the firm expecting to recruit employees from the local area, which helped Alert Logic secure a developmental grant from the Welsh government.

The UK operation will provide a base for sales, marketing and support teams as well as the location of a data centre that will serve customers across EMEA; the firm will also build a security operations centre to provide 24 x 7 monitoring of the threat landscape.

“Data from our recent  Cloud Security Report noted Europe as having the highest volume of attacks in hosting and public cloud infrastructures. European customers are looking for high-quality, cloud-based security and compliance solutions that can address their unique needs,” said Gray Hall, CEO, Alert Logic.

“With over 300 Alert Logic customers in the UK already, we are excited to announce today that Alert Logic is expanding into Europe with the goal of providing better support for our current and future customers,” he added.

Alert Logic warned last week that cloud threats were on the rise, with attacks on the hosted environment starting to resemble the activity in the physical world, and called for more awareness of the situation.

To gather its information, the firm established a number of honeypots globally to get a picture of what was happening. It found that the volume of cloud attacks was up to four times higher in Europe than in the US, and that around 14% of the malware it collected would have slipped past many of the anti-virus products on the market.
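
For context, a honeypot in this setting is simply a decoy listener that accepts connections no legitimate client should ever make and records who knocked. The sketch below, in C for a POSIX system, shows the low-interaction end of that idea; the port number is an arbitrary assumption, and real research honeypots (presumably including Alert Logic’s) emulate full services and capture payloads rather than just logging connection attempts.

    /* Minimal low-interaction TCP honeypot sketch: logs connection attempts
     * on one decoy port and immediately drops them. Illustrative only. */
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <time.h>
    #include <unistd.h>

    int main(void) {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        if (srv < 0) { perror("socket"); return 1; }

        int on = 1;
        setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &on, sizeof(on));

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(2222);          /* decoy SSH-like port (assumed) */

        if (bind(srv, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
            listen(srv, 16) < 0) {
            perror("bind/listen");
            return 1;
        }

        for (;;) {
            struct sockaddr_in peer;
            socklen_t len = sizeof(peer);
            int c = accept(srv, (struct sockaddr *)&peer, &len);
            if (c < 0) continue;

            char ip[INET_ADDRSTRLEN];
            inet_ntop(AF_INET, &peer.sin_addr, ip, sizeof(ip));
            time_t now = time(NULL);
            printf("%s: connection attempt from %s:%d\n",
                   strtok(ctime(&now), "\n"), ip, ntohs(peer.sin_port));

            close(c);   /* record and drop; no real service is offered */
        }
    }

Aggregated across many such sensors in different regions, the connection logs are what allow a firm to compare attack volumes between Europe and the US, as Alert Logic did.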

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

IBM launches partner cloud marketplace at Impact conference

image

IBM has unveiled a new Cloud marketplace at its 2014 Impact conference in Las Vegas, offering partners a means to get their paws on software and services from IBM and third-party vendors, and helping resellers tap into an estimated $250bn revenue opportunity.

Big Blue claimed the launch of its Cloud marketplace marked the next major step in an unfolding strategy to build “the most comprehensive cloud portfolio for the enterprise”.

DIGITAL VISION

So far this year it has already ploughed billions of dollars into the cloud via the expansion of its cloud datacentre network, the launch of its middleware PaaS offering Bluemix, and an ongoing acquisition strategy.

The marketplace will bring these disparate elements together to serve as a self-proclaimed “digital front door” to the cloud.

“Increasingly cloud users from business, IT and development across the enterprise are looking for easy access to a wide range of services to address new business models and shifting market conditions,” said Robert LeBlanc, SVP of IBM Software & Cloud Solutions.

“IBM Cloud marketplace puts big data analytics, mobile, social, commerce, integration – the full power of IBM-as-a-Service and our ecosystem – at our clients’ fingertips to help them quickly deliver innovative services to their constituents.”

Partners will get access to IBM’s own IP, services and software capabilities, collaboration opportunities with peers, and the firm’s enterprise client network.

It will include the full suite of IBM-as-a-Service apps, the Bluemix platform, SoftLayer IaaS and other third-party services from emerging software partners such as Deep DB, Flow Search and SendGrid, among many others.

Andi Gutmans, CEO of web and mobile app development services firm Zend, which is among the launch partners with products available through the marketplace, commented: “IBM has brought together a full suite of enterprise-class cloud services and software and made these solutions simple to consume and integrate, whether you are an enterprise developer or a forward-looking business exec.

“We will support the rapid delivery of our applications through the IBM Cloud marketplace, enabling millions of web and mobile developers, and many popular PHP applications, to be consumed with enterprise services and service levels on the IBM Cloud,” he added.

“Most cloud marketplaces are tied to one specific product offering. If you don’t use the particular service for which the marketplace was built – even if you’re a customer of other products by the same company, that marketplace is irrelevant for you,” explained Jim Franklin, CEO of email services outfit SendGrid.

“But the IBM cloud marketplace will be available to all IBM and non-IBM customers. As a vendor, being able to reach all IBM customers from one place is very exciting.”

Separately, IBM also expanded the range of services available over the Bluemix PaaS offering, including cloud integration, internet of things services, data analytics and DevOps services.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

 Walmart paid $334 million to end India partnership

image

Walmart (NYSE: WMT), which ended a 6-year-old joint venture with Bharti Enterprises last year, spent $334 million to sever ties with the Indian company.

Walmart had earlier paid $100 million to take over its Indian partner’s 50 percent stake in Bharti Walmart Pvt Ltd., which runs 20 wholesale stores under the Best Price Modern Wholesale brand.

In October 2013, Walmart called off its six-year-old partnership with Bharti Enterprises and decided to operate wholesale stores independently in India. The transaction resulted in a net loss of about $151 million, according to the company’s annual report.

Walmart has been trying to rebuild its India operations since ending its partnership with Bharti Enterprises. Walmart said that the Indian government’s regulations requiring foreign retailers to buy 30 percent of products from local small and midsize businesses were the “critical stumbling block” to opening Walmart stores there.

In April, the company said it was making progress, with the announcement that it will open 50 cash-and-carry stores in India over the next five years. Walmart also said it will launch a B2B e-commerce platform for members of Best Price Modern Wholesale Stores.

Walmart has 20 wholesale stores in 15 cities in India under the Best Price Modern Wholesale nameplate. These stores are not open to consumers, but restricted to other retailers. The website will be an exclusive virtual store for its members with a similar assortment of products, as well as special items. Best Price Modern Wholesale has roughly one million registered members.

For more:
-See this Reuters article

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

What Day is it?

image

The Second “Internet Revolution” and being in the “Eye of the Storm”

image

I have been blessed. I went from cleaning pools in Arizona in 1993 to working at the center of the universe in 1994, when I was hired after a thirty-minute phone call and asked, on a Thursday, if I could be at work in Silicon Valley by Monday at the most exciting startup in Internet history, “Netscape Communications Corp.”

I threw everything I owned into my metallic maroon ’89 Z28 and drove there in 10 hours straight, and I didn’t leave for twenty years… I left Netscape in ’99 after five of the most unforgettable years of my life. I started at the bottom, worked my way up to “Manager of World-wide Support Operations,” and have only occasionally looked back…

Okay, you might say, but there were almost four thousand other people there in the “eye of the hurricane” that was Netscape, and you’d be correct. What really makes me blessed is that I am now in the center of a “tsunami” that may just make my experience there seem tame (with its weekly beer-bash music concerts, pets, futons, pool tables, a replica of the Golden Gate Bridge made of beer and soda cans, a life-sized Mozilla dragon, and kegs of beer in cubicles where long-haired, sandal-wearing millionaires worked side by side with suit-wearing IBM types….. I DOUBT IT). OK, but still… Twice in one lifetime??
By Jarrett Neil Ridlinghafer

image

The following excerpt was written
By Kurt Marko

Creating the Enterprise Cloud: A Q&A with VMware CEO Pat Gelsinger

The term “cloud” is by now so overused that most people lump it in with the rest of the marketing buzzword pantheon with terms like “solution”, “leverage” or “ROI.” You’re probably thinking that the cloud is what Google and Amazon do and how could it possibly be relevant to your business? Well it is (what Google does) and it will be (important to every business) and VMware wants to be the company that delivers it. It’s called the software defined data center (SDDC) and for CEO Pat Gelsinger it’s increasingly a part of discussions with his C-level customers.

As I wrote in an earlier column outlining the four pillars of the SDDC, “It’s an expansive vision that could put VMware in the middle of enterprise IT application and service design, construction and delivery. Indeed, if fully adopted by VMware’s customers, … it would make VMware the hub around which all business application and IT infrastructure decisions revolve. Meaning that in an era of cloud computing and software as a service, the SDDC could do for VMware what the Windows family of PC and server software did for Microsoft in the age of client-server computing.”

Talk to Gelsinger as I did in an exclusive interview after he delivered a keynote address at the annual Interop conference and you can sense both his passion for the subject and depth of involvement. This Stanford-educated engineer who once ran x86 development projects for Intel is clearly no hands-off financial manager, as evidenced by his deep understanding of the technology and the business problems it addresses. Read on and you’ll see.

Realizing the Software Defined Data Center

Kurt Marko: The keynote was a good strategy about the four pillars of the SDDC. I’m wondering if you would like to talk a little bit more about kind of where you see VMware playing in each of those and where the company is as far as its journey in each of them. Clearly, compute was the foundation where you’re furthest along. Networking you’re covering with NSX. And the other two area, maybe even more immature. Just talk to that.

Pat Gelsinger: Sure. Let’s cover the spectrum. Obviously, starting with compute, it’s sort of a birthright at that point and we’re going to keep plowing ahead forever. You know, as I drive my team, 100% virtualized with very high market share. We’re not done, and you’ll see us just continue to pound. You know, where are people not virtualized today? Right? There’s a handful of places. The very big apps, right? So we’re going to keep driving on the mission critical. Those [applications] are performance-centric or for whatever reason have been, I’ll say, hardware-centric. Some of those would be Hadoop, some of those would be HPC. And we announced our big data extensions to start attacking the Hadoop problem more directly, every release continuing to drive down latency. … [We’re] continuing to work with customers to get more of their P2V’s [note: physical to virtual machine migrations] done. I continue to be amazed — I go into big accounts, and they still have 20,000, 30,000 physical servers. It’s like, what’s going on here, guys? It’s just, you know, quite amazing.

Q: So, expanding the scope of what is appropriate for virtualization?

Pat Gelsinger: Yeah, just keep on that path. And then the next one, you know, the next area of compute virtualization I see opening up is network function virtualization, the whole telco space. There’s approximately 6 million servers in that infrastructure, almost none of which are virtualized today. So that’s sort of another greenfield opportunity for us to go pursue. So that sort of is the compute piece of it [SDDC].

From Server Virtualization to Infrastructure Management

Pat Gelsinger: The next one I’ll go to is management… If you think about it, vCenter is probably the most prolific management tool in the history of the data center.

Q: It’s spawned its own ecosystem.

Pat Gelsinger: Right. Plug-ins, and enhancements to it, and so on. But now it very much is filled out that whole clan of cloud management stack, and that’s a reasonable size business for us today. And we think about it in three buckets: cloud provisioning, cloud operations, and cloud financial management. … Most customers that we’re engaging with today are really the combination of vSphere and cloud management. … Our picture is everybody has a problem here [with cloud management]. Every customer, they’ve got some legacy BMC, they’ve got some point tools here and there, they’ve got CA and Tivoli, they’ve got ITIL this. Every customer needs to transform their management environment to one, create much more automated operations, but also to create the environment that they can be hybrid and multicloud as they go forward. So that really resonates with customers, and you’ll see us keep building on that product suite.

Q: I was reading through the last conference call (Q4 2013) and it sounded like you’re getting a lot of momentum there. I think it was Carl [President and COO Carl Eschenbach] that cited 50% of your new licenses and 70% of revenue are vCloud suite now. That was kind of a surprise to me; that it was so high.

Pat Gelsinger: Yeah, and when we say vCloud suite, we generically say all four legs of SDDC. But the real, and that’s what I mean, the meat and potatoes is vSphere plus cloud management. Those are the ones that are really moving the revenue needles today. And that’s just going to keep getting tighter and tighter for us as we get those better integrated. You know, we had DynamicOps [automation software acquired in 2012] was one of the key products. They were on a different trajectory and we’ve had to sort of meld them into rest of the suite, so there’s things that need to be better integrated, more elegant for customers, broader support for use cases, so just a lot of work to do there.

Virtual Networks: The Current Battleground

Pat Gelsinger: Then continuing, obviously NSX has the center theme of today’s [Interop] keynote, and I think about this as the next big pillar to move, because so much of the provisioning automation limitations of the data center are around the network. VMs spin up very quickly, but I can’t get the network services, I can’t provision the routing tables, I can’t get the firewall rules in place, so I can’t deliver the new app. And that really has become a pain point that really resonates with customers when we talk about NSX. Just the rigidity of the network.

Q: Like the WestJet example [video presentation during the keynote from Richard Sillito from WestJet] and your other testimonials?

Pat Gelsinger: This is powerful for customers as well. So it is the next big one for us, but when we think about 500,000 customers for VMware, we’re really just touching a subset of the enterprise customers today. We launched it last year, and we have 100-plus customers on the [NSX] platform now. We keep adding customers onto the platform. But there’s just a long way to go here, because it is very transformational. You don’t go in and say to a guy who’s been pushing packets, running pings and trace routes, right, and say, “Here’s a new way to provision your entire network.”

Q: It’s transformational at an organizational level as well, because the people using your suite are typically not network guys, they’re server guys.

Pat Gelsinger: Right. And they’re increasingly becoming those. Our typical V -admin or V-architect, they’re sort of stretching out because they’re tired of running over to the networking guy and say, “Here’s what I need you to do,” and getting the answer, “It will be in the second network upgrade a month from now.” And not being able to operate at the business kind of speed that they want to operate. And then when they finally do it, they screw it up and now you’ve got to wait for the next network outage. You know, this kind of provisioning, this manual provisioning and update cycle is very pained. Now when they can start introducing things like NSX, I’m just saying, “It’s done. I was able to create the virtual east-west firewall and I’m done.”

From Networks to Storage: the Virtualization of Everything

Pat Gelsinger: So NSX is the next leg. And then VSAN was obviously our big launch a few weeks ago for virtualizing storage, and that’s gotten … I was surprised. I wasn’t expecting that we’d get two awards at Interop for V-SAN.

Q: I wasn’t judging that Best of Interop category, but I heard similar surprise from people.

Pat Gelsinger: I just GA’d the product two weeks ago, and it’s sort of like, wow, this is pretty good. So that was a very pleasant surprise, and the industry resonance to that, again it’s taken the V-admin, and rather than him having to go through a laborious process to get LUNs provisioned, now he essentially is in control of being able to have VMDK storage that can be spun up and managed in a dramatically elegant way. … Being able to now have this fully automated provisioning against workloads is very powerful. And we do think that whole model will complement a lot of your traditional storage arrays. As we see the world, that storage was really a one-appliance-needed-to-fit-all world, and it’s becoming increasingly separated into the hot edge and the central store. If I’m moving the hottest workloads onto the edge cluster, that means that the central storage appliances become increasingly focused on capacity, because performance is being delivered by these cluster storage models, VSAN on the edge, so that you can actually start driving a very different cost-performance dynamic into the central storage requirements.

The last comment of this; I feel good ending Q1, because I now have the full SDDC components in place. We laid out the vision about a year and a half ago, and the vision resonates. We talked to a customer, the conceptual sale is done in the first two minutes of the customer meeting. Now I can actually execute on all the legs of it, and really say, “Yep, we have all the components and we can truly change the way that you build data centers,” and that’s exciting.

image

The following is written
By Jarrett Neil Ridlinghafer

It’s a very interesting interview. I like VMware and respect their products; indeed, I consider them the best at what they do, and I believe they have contributed a lot to the world of “Data-Center Infrastructure” since their first 20-floppy-disk product in the late ’90s (which, as the owner of my own ISP and fiber-optic data center at the time, I found horrendous to experience). The remarkable thing is just how far they’ve come in those twenty years. And yet I do see some flaws in their strategy. The biggest is that they’ve grown too big: they are now “stuck in the rut” of the behemoth corporate mentality and too big to truly innovate. Like IBM, one of the true innovators in its early years, they’ve become followers of innovation and trends rather than the leaders they once were. I am a firm believer that true genius comes from small teams and individual inventors, researchers, and inspired programmers sitting alone in the dark closet of a room, buried in empty energy-drink cans and strewn candy wrappers, not from corporate giants who can barely make a decision without a dozen meetings followed by a “working group” to “investigate” the idea… Many of you are nodding your heads and laughing right about now, I imagine 🙂

I am currently working with a true visionary in the data-center space, someone who sees the future in the past. He recently explained to me the six-year project he and his team have been working on: a truly software-defined, 100%-uptime, self-healing clustered data center based on the concept of the “mainframe.” I can’t say more than that without violating a trust, so I will leave it there for now. However, I also recently read an old whitepaper entitled “The Software Defined Internet,” and both made me think of my own vision of the “cloud” from five years ago.

In the coming months I will be writing about the amazing infrastructure he and his team have been quietly building out across eighteen PoPs around the world, the plans they have for a completely new data-center OS, how it is being architected, and when they plan to go live with their vision as a product.

You will only read and hear about it here, since I have exclusive access.

I think you’ll find their vision may just be what your vision is also, when you think of the future the “Cloud” will usher in, at least if your vision is anywhere near as big as mine is.

Let me tell you my vision of the cloud so that you’ll see just how truly game-changing their new infrastructure really will be, since I’m now convinced that what they are building is the first step on the journey to the ultimate future of technology many of us dream about.

Recently I had the opportunity to explain to someone whom I consider a close friend what, five years ago, I started describing as the “Cloud future,” and it begins with a bang… “The cloud will be bigger than the browser was for the Internet” (sorry @pmarca). I know that is a HUGE statement for those of you who realize what I just said and what a statement like that truly entails. But the browser made the Internet what it is today; without it there would be something completely different now, and don’t ask me what, I have no idea 🙂. (And no, I don’t believe the theory that if Marc Andreessen had not invented it, someone else would have; that’s a cop-out by those who have no vision and no inventive bone in their bodies, a refusal to give credit where it’s due.)

The Future Internet
When I envision the future Internet, I think Star-trek and more. What’s amazing is how many ideas we watched on Star-Trek have already come into being and are actually being used today. There are a lot more however which have yet to be realized and I believe the “cloud” will take us there:

Holographic Video

Rooms where you actually “live” as part of a video game

Holographic images which are so close to real you won’t be able to differentiate the two.

Flying, totally automated vehicles in every garage

Robots who look and feel and act exactly like humans

Life spans that reach hundreds of years

Every appliance connected and intelligent

Every habitat totally self-sufficient, with the CPU power of a thousand IBM “Big Blue” supercomputers and the intelligence of a billion human brains

But… It all begins with a Software Defined Internet where there is Unlimited CPU access/power, unlimited Memory access, Unlimited storage space and ZERO downtime.

I believe that we are nearly there, and that “Cloud Technology” (no matter that I agree it was originally a marketing hype term, the fact is it’s now the future) will be what takes us there. I became a true believer once I truly grasped just how powerful and awe-inspiring its actual potential is. And even though I was right in the thick of the “Internet revolution” during the ’90s, and nothing will ever compare to my five years at the center of that tsunami at Netscape, I cannot wait to see and be part of this next killer wave, “The Cloud,” and to continue to be at the center of this obvious “second Internet revolution” as it happens.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

New York-based fitness startup Peloton has raised $10.5 million

image

New York-based fitness startup Peloton has raised $10.5 million in a Series B round led by Tiger Global Management with participation from angel investors.

Peloton sells tablet-equipped, interactive exercise bikes online and through three stores in the New York metropolitan area.

Founded in 2012, Peloton has raised more than $14 million to date and will use the new funds to ramp up production and open its first indoor cycling studio.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based sales productivity platform Stitch has raised $3.25 million

image

San Francisco-based sales productivity platform Stitch has raised $3.25 million in seed funding from Google Ventures, SoftTech VC, Freestyle Capital, Foundation Capital, ENIAC Ventures and a handful of angels.

Stitch is a mobile sales productivity platform that provides salespeople with data and information to improve response times and close more deals.

Founded last year, Stitch is currently in private beta.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based mobile shopping app Wish has raised a $19 million round

image

San Francisco-based mobile shopping app Wish has raised a $19 million round led by GGV Capital and Formation 8 with participation from Jerry Yang and existing seed investors.

Wish allows users to view a personalized feed of goods and create lists, and provides shopkeepers with a platform to run highly targeted offers.

Founded in 2011, Wish currently has 25 million users in 50 countries and will use the new cash to fund international expansion.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

New York-based key management startup KeyMe has raised $7.8 million

image

New York-based key management startup KeyMe has raised $7.8 million in Series A funding from White Star Capital, Battery Ventures, 7-Ventures LLC, and Ravin Gandhi.

Through KeyMe’s mobile app, users can scan keys with their smartphone to make duplicates on the spot or order them by mail.

Launched in 2013, KeyMe has raised $10 million to date and will use the new funding to add hundreds of kiosks in local stores across the country within the next year and grow its 13-person team.

 

 


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Boston-based insurance startup Consumer United has raised $14 million

image

 

 

Boston-based insurance startup Consumer United has raised $14 million in new funding co-led by Spark Capital and Thayer Street Partners with participation from Village Ventures and Five Elms. Consumer United provides online tools for consumers to compare rates on auto and home insurance for major insurance providers. Founded in 2007, Consumer United currently serves customers in 38 states and has raised a total of $70 million in funding to date.

 

 


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

New York-based online invitations startup Paperless Post has raised $25 million

image

 

New York-based online invitations startup Paperless Post has raised a $25 million Series C round led by August Capital with participation from existing investors RRE Ventures, SV Angel, Tim Draper, Ram Shriram, and Mousse Partners. Paperless Post lets users create custom invitations online and send them to their friends via email or offline with its new printing service PAPER. Launched in 2008, Paperless Post currently has over 45 million users and will use the new funding to accelerate mobile and online development.


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

New iOS malware highlights threat to Apple mobile devices

image

 

A newly discovered malware dubbed Unflod Baby Panda is stealing Apple ID credentials from jailbroken iPhones and iPads, security researchers warn.

Unflod hooks into the SSLWrite function of an infected device’s security framework, according to a blog post by German security firm SektionEins.

The malware is designed to listen for outgoing connections. Once it recognises an Apple ID and password, it sends these unencrypted IDs and passwords to the cyber criminals behind the malware.

The Unflod malware also highlights the risks of installing unknown apps on jailbroken iPhones.

Reports of the malware targeting Apple iOS emerged in posts on reddit by iOS users hit by repeated system crashes after installing iOS customisations that were not part of the official Cydia market.

A developer for the Cydia market, an alternative to the Apple App Store, has responded to the news in a reddit comment, saying that the probability of Unflod coming from a default Cydia repository is fairly low.

However, he added: “I don’t recommend people go adding random URLs to Cydia and downloading random software from untrusted people any more than I recommend opening the .exe files you receive by email on your desktop computer”.

The origin and source of the malware is still unknown, which means no one can yet say which software package from what unofficial repository is likely to initiate an infection, according to security firm Sophos.

The infected file relies on add-on functionality, commonly available on jailbroken devices, known as Cydia Substrate or Mobile Substrate, the firm’s Paul Ducklin wrote in a blog post.

This “substrate” allows users to extend and modify the behaviour of iOS in ways that are deliberately prohibited by Apple on devices that have not been jailbroken.
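
To make that mechanism concrete, here is a minimal, hypothetical sketch in C of how a Substrate-style hook on SSLWrite is wired up. This is not Unflod’s code; it assumes the Cydia Substrate SDK (substrate.h) and Apple’s SecureTransport headers are available, and the replacement function here merely logs that a call happened before passing the data through unchanged, which is the interception point the malware abuses to read credentials before they are encrypted.

    /* Illustrative Substrate hook sketch, not the actual malware. */
    #include <substrate.h>                  /* assumes the Cydia Substrate SDK */
    #include <Security/SecureTransport.h>   /* SSLWrite, SSLContextRef */
    #include <stdio.h>

    static OSStatus (*orig_SSLWrite)(SSLContextRef, const void *, size_t, size_t *);

    /* Replacement: note the call, then hand everything to the real SSLWrite. */
    static OSStatus hooked_SSLWrite(SSLContextRef ctx, const void *data,
                                    size_t length, size_t *processed) {
        fprintf(stderr, "SSLWrite intercepted: %zu bytes\n", length);
        return orig_SSLWrite(ctx, data, length, processed);
    }

    /* MobileSubstrate loads the dylib into target processes; this constructor
     * installs the hook when the library is injected. */
    __attribute__((constructor))
    static void init_hook(void) {
        MSHookFunction((void *)SSLWrite, (void *)hooked_SSLWrite,
                       (void **)&orig_SSLWrite);
    }

Because the injected library runs inside every process that loads it, anything written over an SSL connection passes through the replacement function before encryption, which is why installing dylibs from untrusted repositories is so risky on jailbroken devices.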

However, Ducklin said the threat is limited because the malware can affect only jailbroken devices and SophosLabs has not had any report of “in the wild” infections.

“If you haven’t jailbroken your iOS device, you don’t need to worry.

“If you are a jailbreaker and you have been circumspect in what you choose to install, you probably don’t need to worry,” Ducklin wrote.

The malicious code works only on 32-bit versions of jailbroken iOS devices, according to SektionEins.

There is no ARM 64-bit version of the code, which means the malware should never be successful on the iPhone 5S, iPad Air or iPad mini 2G, the firm told ArsTechnica.

SektionEins recommends that anyone affected by the malware should restore the device and change their Apple ID and password as soon as possible.


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Domino’s Now Accepting Google Wallet!

image

Domino’s Pizza (NYSE:DPZ) has integrated Google Wallet (NASDAQ:GOOG) with its ordering app, and Android customers can now pay for their online orders using the service.

Google Wallet lets users store debit cards, credit cards and loyalty cards electronically on their mobile phone. Domino’s customers who place an online order of $10 or more using the Android ordering app and pay by selecting the “Buy with Google” button at checkout will receive a free order of Domino’s new Specialty Chicken, now through June 15.

“This is yet another way Domino’s is using technology to improve our customer experience,” said Patrick Doyle, Domino’s Pizza president and chief executive officer. “Google Wallet is a great technology that allows customers even more flexibility and convenience when it comes to paying for their Domino’s orders.”

Domino’s mobile app now accounts for approximately 40 percent of U.S. orders, and last year the company reported $3 billion in digital sales globally. Investing in, and promoting, mobile ordering is a priority for the chain. Domino’s encourages customers to create profiles and store payment information in order to speed up the process. Domino’s customers can now order a pizza in roughly 30 seconds.

The faster customers can order, the more they will order, according to Doyle.

Adding Google Wallet as a payment option is just another weapon in this arsenal. The company has been focused on technological innovation of late, approaching new initiatives with the mindset of a startup. “We will continue to come up with every way possible to conveniently order from Domino’s, and use technology to offer the best customer experience possible,” added Doyle. “This is just the latest step, and we are very excited to roll this out later this year.”

Domino’s is just one of the many quick-serve restaurant chains to adopt mobile payments. Burger King (NYSE:BKW) and Wendy’s have both recently launched mobile payment apps, and the drive to accept mobile payments continues at breakneck speed.

For more:
-See this FierceRetail story
-See this Engadget story

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Gap to make $300 million digital investment

image

Gap (NYSE:GPS) is making a $300 million investment in digital as the retailer builds out omnichannel capabilities and focuses on responsive design.

Executives talked up technology, innovation and scale as competitive advantages during the retailer’s annual shareholder meeting in San Francisco last week, noting the chain is making progress toward bridging digital with its physical stores.

“We have the world’s best collection of American brands coupled with a strong economic model and runway for global growth,” said Glenn Murphy, chairman and CEO. “As the retail landscape evolves, we continue to deliver on our omnichannel roadmap and focus on owning the shopping experience of the future.”

New initiatives include offering reserve-in-store, find-in-store and ship-from-store options to be tested later this year. The reserve-in-store service began as a test in June 2013 and will be expanded to all Gap stores in the United States by the end of the second quarter, thus enabling online and mobile shoppers to reserve and pick up items at more than 1,000 Gap and Banana Republic store locations.

Mobile is a big focus for Gap going forward. “The opportunity to better monetize the huge amount of incremental traffic coming off of this device we see as very significant, and you will see pretty radical progress in our mobile Web experience, and the experience delivered to this device over the course of the next several months,” Art Peck, president of growth, innovation and digital, said during the meeting, according to Mobile Commerce Daily.

Much of the investment will go toward developing websites with responsive design. “Responsive design (is) a big buzzword in the industry,” Peck said. “We are rolling out responsive design as we speak.” The redesigned sites will feature an improved check-out experience and better browsing, features that Peck said will enhance Gap’s ability to drive m-commerce.

There will also be a test of mobile point-of-sale technology that will be part of a new loyalty program that the retailer plans to test later this year.

Gap has been playing catch-up in the digital space. While many of its competitors have long offered in-store ordering for out-of-stock items, Gap is just now talking about implementing endless aisle inventory.

For more:
-See this Gap press release
-See this Mobile Commerce Daily story

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Google Owns Four of the Seven Online Businesses with more than One Billion Active Monthly Subscribers

image

By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

AOL Mail Hacked, Accounts Sending Spam

image

If you get a suspicious email from an AOL user, it’s probably best to delete it. The service has apparently been compromised and some accounts are sending out spammy messages.

But rather than compromising actual accounts, it appears the scammers are just spoofing them. As AOL explained in a help page, “spoofing is when a spammer sends out emails using your email address in the From: field. The idea is to make it seem like the message is from you – in order to trick people into opening it.”

“These emails do not originate from AOL and do not have any contact with the AOL Mail system – their addresses are just edited to make them appear that way,” the company said. “The message actually originates from the spammer’s email account and is sent from the spammer’s email server.”

The easiest way to tell if you’ve been affected is if your inbox is littered with message bounce backs from emails you never sent. Or perhaps a friend or two has been kind enough to alert you to the spam messages your account appears to be sending. To determine if you’ve been hacked versus spoofed, check your sent messages: if there are sent emails you didn’t send, it’s a hack. If there’s nothing there, it’s a spoof.

AOL is urging users to change their passwords and be on the lookout for sketchy emails so they don’t fall prey to phishing scams.

“AOL takes the safety and security of consumers very seriously, and we are actively addressing consumer complaints,” the company said in a statement. “We are working to resolve the issue of account spoofing to keep users and their respective accounts running smoothly and securely. Users can find the latest updates on our AOL Help site, and should contact us if they believe their account is being spoofed.”

It appears the problem has been going on for about a week. AOL’s @aolmailhelp Twitter account has been responding to complaints from users since at least April 15, most of which direct users to the help page.

UPDATE: AOL on Monday said it would change its email policies to avoid delivery of spoofed messages. “AOL Mail is immediately changing its policy to help mail providers reject email messages that are sent using forged AOL Mail addresses,” the company said. “By initiating this change, AOL Mail, along with other major email providers will reject these spoofed email messages, rather than deliver them to the recipient’s inboxes.” More details are on its blog.


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Google refunds bogus security app buyers

image

Google has announced it will refund customers who paid for a bogus security app from its Google Play Android app store.

Those duped by the fake app will also be given $5 (£2.97) credit to spend in the online app store, reports the BBC.

The Virus Shield app was ranked as the top-selling new paid app in the Google Play Store within a week of its publication in April and reached more than 10,000 downloads before it was removed.

The app was exposed as a fake by the Android Police news site after its investigators found that the app’s code contained no security functionality at all.

The app, priced at £2.35, claimed to prevent harmful apps from being installed, protect personal information and scan apps, settings, files and media in real time.

Google removed the app from Google Play after the Android Police site found that all the app did was change its logo from an “X” image to a “check” image and nothing more.

‘Foolish mistake’

The developer of Virus Shield, who had updated the app several times, told the Guardian that the app was a “foolish mistake” and had been uploaded with the anti-virus code missing.

Google said it would refund anyone who bought the app because Google Play’s policies “strictly prohibit” false claims, which in this case are believed to have netted the developer more than £20,000.

Virus Shield has prompted calls for stricter controls on the content made available in the Google Play store, such as an automatic review of all content that reaches the top ten seller rankings.

Control over apps

“Unfortunately the wide-open nature of the Play Store means that unscrupulous people can take advantage of it,” the Android Police site observed.

But comments in user forums indicate it would be difficult to find consensus on what types of controls would be sufficient, with many Android users rejecting the tight controls imposed by Apple.

Fake security product scams have emerged as a top money spinner for cyber criminals in recent years, but – unlike Virus Shield – most of these bogus security products include malicious code.

In July 2013, security firm Symantec found that Google Play was “riddled” with malicious apps, despite efforts to keep it clean.

A test search carried out by Symantec on Google Play returned malicious apps in 21 of the top 24 hits.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

With Heartbleed, IT leaders are missing the point

image

Evan Schuman, Computerworld | Security

April 17, 2014, 2:56 PM — The IT response to Heartbleed is almost as scary as the hole itself. Patching it, installing new certificates and then changing all passwords is fine as far as it goes, but a critical follow-up step is missing. We have to fundamentally rethink how the security of mission-critical software is handled.

Viewed properly, Heartbleed is a gift to IT: an urgent wake-up call to fundamental problems with how Internet security is addressed. If the call is heeded, we could see major improvements. If the flaw is just patched and then ignored, we’re doomed. (I think we’ve all been doomed for years, but now I have more proof.)

Let’s start with how Heartbleed happened. It was apparently created accidentally two years ago by German software developer Robin Seggelmann. In an interview with the Sydney Morning Herald, Seggelmann said, “I was working on improving OpenSSL and submitted numerous bug fixes and added new features. In one of the new features, unfortunately, I missed validating a variable containing a length.”

After Seggelmann submitted the code, a reviewer “apparently also didn’t notice the missing validation, so the error made its way from the development branch into the released version.” Seggelmann said the error was “quite trivial,” even though its effect wasn’t. “It was a simple programming error in a new feature, which unfortunately occurred in a security-relevant area.”
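
For readers wondering what “missing validating a variable containing a length” looks like in practice, below is a deliberately simplified C sketch of the bug pattern, not the actual OpenSSL source: a heartbeat-style echo that trusts the length claimed by the sender. The function and variable names, and the placement of the fix, are assumptions for illustration only.

```c
/* Simplified illustration of the Heartbleed bug pattern -- not OpenSSL's code. */
#include <string.h>
#include <stdlib.h>

/* Echo back `claimed_len` bytes of the sender's payload.
 * `payload` points into a received record of `record_len` bytes. */
unsigned char *heartbeat_echo(const unsigned char *payload,
                              size_t claimed_len, size_t record_len)
{
    /* FIX: without this check, a sender can claim a length far larger than
     * the data it actually sent, and the memcpy below will read adjacent
     * process memory (keys, passwords, session data) and return it to the
     * attacker. This is the missing length validation. */
    if (claimed_len > record_len)
        return NULL;                    /* silently drop bogus requests */

    unsigned char *response = malloc(claimed_len);
    if (response == NULL)
        return NULL;

    /* Vulnerable versions performed this copy using the attacker's
     * claimed length, without the bounds check above. */
    memcpy(response, payload, claimed_len);
    return response;
}
```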

What Seggelmann did was fully understandable and forgivable. The massive planet-destroying problem is that our safety mechanisms for simple math errors are all but nonexistent. If our checks and balances are so fragile that a typo can obliterate all meaningful security, we have some fundamental things to fix. Let’s not forget that when Robert Tappan Morris unleashed the Internet Worm back in 1988 — the first major instance of the Internet crashing due to a worm — it was also the result of a math error. He never intended to cause servers to crash, but crash they did.

David Schoenberger, CIO of security vendor Transcertain, argues that the real fundamental security flaw at play here is, bizarrely enough, an overabundance of trust exhibited by IT security folk. Personally, when I think of the best IT security specialists I’ve worked with over the years, having too much trust is not the first thought that comes to mind. But Schoenberger makes a good point.

“This is going to make people rethink what we’re doing. There are so many things overlooked, taken for granted. In the IT world, we’ve relied on the trust factor for so long,” he said. “Just look at these billion-dollar companies who are relying on peer-reviewed open source. We’re not taking the time to prove it [is secure] ourselves. Because something mostly works and, as far as perception goes, it works well, it passes all our tests. It sucks the way testing is occurring right now with open source. But I won’t even limit it to open source, as this could have happened to a commercial provider. Could have happened to anyone.”

Fair and legitimate point, but is there a practical and better way? It’s not akin to a company testing its own applications (although if we take mobile apps as a hint, we’re not exactly getting an A+ there, either).

Microsoft has been legendary in its crowd-sourcing strategy: An initial software cut is released to millions, and they find the holes. This gave rise to my favorite Microsoft quip, many years old and unattributable at this point, unfortunately: “Here at Microsoft, quality is Job 1.1.” The crazy thing is that it generally worked. How did Heartbleed spend two years in full circulation before any security researcher noticed this error?

Some are convinced that the hole must have been noticed by someone. The National Security Agency has been accused of knowing of this hole and exploiting it. The accusation led to the NSA issuing what may be the least credible denial in quite some time: “Reports that NSA or any other part of the government were aware of the so-called Heartbleed vulnerability before April 2014 are wrong,” the statement from the U.S. Office of the Director of National Intelligence said. “The Federal government was not aware of the recently identified vulnerability in OpenSSL until it was made public in a private sector cybersecurity report.”

There are two parts of the full statement (read it here) where credibility leaches out. First, between the CIA, the FBI, the NSA, the military and let’s say 200 other government operations, it’s ludicrous to declare that nobody knew about something. How do you know that one Army security specialist didn’t know? Not every geeky hole that is discovered is necessarily included in a memo to senior management. Had they said “to the best of our knowledge” or “we can’t confirm that anyone here knew about it,” that would at least be plausible. It’s like my teenager telling me that nobody in her high school uses drugs or drinks. It doesn’t pass the laugh test because there is no way she could know such information definitively.

The second concern with the NSA statement is the very last line: “Unless there is a clear national security or law enforcement need, this process is biased toward responsibly disclosing such vulnerabilities.” Nothing in the statement says that no such need was found in this case. This is akin to my daughter following up her no-drugs testimony by saying, “I will always tell you the absolute truth about such things, unless I conclude that it would cause problems for my friends, in which case I would lie.”

I’m generally no fan of adding bureaucracy, but it might be time to create formal review procedures — ideally, with multiple layers — with people actively and openly looking for holes. Peer review is great, but for anything as mission-critical as Internet security, we are way past the time to proactively seek out such holes, rather than hoping we stumble upon them.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Personal Cloud Market Expected to Reach $43.5 Billion by 2018

image

 

(PRWEB) April 20, 2014

The report, “Personal Cloud Market – Global Advancements, Business Models, Technology Roadmap, Forecasts & Analysis – 2018”, defines and segments the global personal cloud market with analysis and forecasting of global revenues. It also identifies drivers and restraints for the personal cloud market, with insights on trends, opportunities, and challenges. In addition, the report offers business case analyses, models, and go-to-market (GTM) and pricing strategies.

Browse 150+ market data tables/figures spread across 176 pages and an in-depth TOC on “Personal Cloud Market”.
http://www.marketsandmarkets.com/Market-Reports/personal-cloud-market-821.html


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Milbank: Getting lost and disconnected in cloud of technology

image

By Dana Milbank, Columbian Syndicated Columnist

WASHINGTON — I missed my brother’s anniversary last week. But I have a very 21st-century excuse: I lost it in the cloud. I discovered this the next day when my sister-in-law mentioned they had been out to dinner to celebrate their 15th.

I thumbed to April 11 in my phone’s calendar, where I’m certain I had entered the occasion years ago: Nothing. I went to July and checked both of their birthdays: Gone. My 13-year-old nephew’s birthday had vanished but, oddly, my 10-year-old nephew’s birthday remained.

For years, I had been diligent about recording special occasions in my electronic calendars and sending out greetings to friends and family. I don’t know exactly how, but at some point in the last year it all went haywire.

I sent birthday wishes to my friend Mark last year on May 8. “I truly appreciate the sentiments,” he replied. “My birthday was April 7.”

Other friends’ birthdays suddenly were listed on two subsequent days, as if they were 48-hour celebrations. Then there was my friend Steve. His birthday in my calendar is listed as March 18, and also March 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30 and 31.

Poor Steve is aging twice as fast as my dog.

The cause of this may well be user error. Somehow, in merging my PalmPilot and BlackBerry calendars with Lotus Notes, Gmail, Outlook and my various iPhones and iOS versions, one device must have overridden all the others, omitting crucial dates. I’d ask The Washington Post’s IT department to help, but it has enough to do keeping the newspaper publishing without retrieving my brother’s anniversary.

While my calendar entries disappear, somebody or something — Apple? Google? Microsoft? — is putting unwanted dates in my calendar. Does Outlook suppose I wouldn’t know April 15 is tax day, or that New Year’s Day is Jan. 1?

This is, of course, just one of many ways in which the very technology that is supposed to connect us leaves us more disconnected. For me, the bigger problem may be that I rely on technology so much that it lets me forget the most basic things. Why should I make space in my brain for my brother’s anniversary if my phone does it for me? I remember few phone numbers — not even my own. My home number starts with 244, I’m pretty sure, but for the rest I have to look myself up in my contacts.

My iPhone calculator has replaced arithmetic for me, and Microsoft Word now catches me making not just spelling errors but mistakes in syntax. I’ve been second-guessing my directional sense with Google Maps — even to confirm whether I’ve found the most efficient route to my daughter’s school.

Smarter than Google

The good news for human brains: Google’s directions are often dumb compared to my own. None of its algorithms or live traffic data can tell you how bad an idea it is to take southbound 36th Street NW to Connecticut Avenue during morning rush hour, as Google wants me to do. It’ll be lunchtime before you get through that intersection.

Perhaps I should apply a similar human touch to the lost birthday problem. Even if my electronic calendar accurately prompts me to dash off a one-line email, I haven’t had any meaningful contact with the birthday celebrant.

My friend Mark, whose birthday I observed a month and a day late, probably didn’t much care. He and I and other friends take turns hosting an annual reunion. We hug, eat and drink, take walks and have long and intimate conversations about our lives. These gatherings are some of the happiest days of the year, and technology will never duplicate them.

My brother and his family are visiting me this week for Passover, the Jewish festival celebrating spring and renewal. These days together are far more important than the anniversary email.

Still, if anybody has a working e-calendar, would you remind me next April 11 that it’s my brother’s anniversary? Just call me at my home number, 244- . . . oh, never mind.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

German Scientists 3D Print Lightweight Material Stronger than Steel

image

Innovation in the 3D printing industry is coming from all sides now, with some of the most important developments occurring in material science. The more materials we can print, the more useful the technology becomes. What many users of the technology have discovered, though, is that 3D printing allows for the creation of unique geometries that increase the usefulness of certain materials.  By printing objects with specific shapes, we can induce important physical properties in our prints.

When it comes to the strength-to-weight ratio of a material, this couldn’t be more important as researchers constantly work to make strong, yet lightweight composites for fields like the aerospace industry. This is where scientists at the Karlsruhe Institute of Technology in Germany come in. Using the Nanoscribe laser lithography printer, which we’ve covered on the site before, PhD student Jens Bauer and his colleagues developed a process for 3D printing microscopic structures that are less dense than water, but stronger than steel. Bauer tells The Conversation that, “This is the first experimental proof that such materials can exist.”

image (Ashby chart)

If you look at the Ashby chart above, you’ll see that most materials that can withstand pressures of up to 280 MPa (megapascals, a unit often used to measure stiffness or tensile strength) are metal alloys that tend toward the denser side of the weight spectrum. There are a couple of exceptions, however. Wood and bone tend to have high tensile strength while maintaining a very light weight, thanks to their porous internal structures. Taking a cue from these natural materials, the team went about designing 3D structures that might exhibit similar properties.

image

3D Printed shapes

Bauer and his colleagues 3D printed microscopic trusses out of a ceramic material in the shapes you see above. In order to further increase the stiffness of these shapes, a coating of aluminum oxide was applied using atomic layer deposition (ALD). The strongest structure the team was able to create, with its honeycomb shape and an alumina coating 50 nanometers thick, had a density below 1,000 kg/m³ and withstood pressures of up to 280 MPa. This makes the material both lighter than water and stronger than some forms of steel, The Conversation points out.
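
To put the strength-to-weight claim in perspective, here is a quick back-of-the-envelope comparison of specific strength (strength divided by density). The steel figures are assumed, typical values for mild structural steel, not numbers from the study.

```latex
% Specific strength = strength / density (illustrative, approximate figures)
\frac{\sigma_{\text{lattice}}}{\rho_{\text{lattice}}}
  \approx \frac{280\ \text{MPa}}{1000\ \text{kg/m}^3}
  = 0.28\ \text{MPa}\cdot\text{m}^3/\text{kg}
\qquad
\frac{\sigma_{\text{steel}}}{\rho_{\text{steel}}}
  \approx \frac{250\ \text{MPa}}{7850\ \text{kg/m}^3}
  \approx 0.032\ \text{MPa}\cdot\text{m}^3/\text{kg}
```

On those assumed figures, the printed micro-lattice carries roughly eight to nine times more load per kilogram than mild steel, which is the sense in which a material lighter than water can be called stronger than steel.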

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Changing Your Password Won’t Stop Heartbleed! The most Prolific and Insidious Internet Threat in History Explained

image

I’ve read many stories and even received many emails by service providers telling me

“our systems have not been affected by this “Heartbleed” bug you may have heard about…however to be sure your information is safe we are asking all customers to change their password”

This is the stupidest statement I’ve ever read, and it’s a blatant lie… These companies are either ignorant themselves or just trying to placate you; changing your password does absolutely NOTHING TO PREVENT THE WORST HACK SINCE THE INTERNET BEGAN…

You see, the bug is part of the Internet infrastructure at this time: it sits in security software called OpenSSL, which an estimated 95% of all websites, other software and Web applications use for almost EVERYTHING! Which is why it has been said to be the most serious incident in the history of the Internet…

Most people have no idea just how serious this is, but I will try to show you and explain it in terms you can understand.

Consider every bank, every stock trading service, every hosted website, every login script, every online store, every email server, every security tool, every security protocol, every browser, every SSL certificate, every payment processing service, every software application currently in use… they all use OpenSSL, embed it in their code, or support it.

The Irony

OpenSSL is software that was developed many years ago to keep you and everyone else safe from hackers and predators intent on breaking into banks, websites, stores and payment gateways…

The Problem

Someone very smart recently looked through the code behind OpenSSL and discovered a basic software development mistake that now opens OpenSSL up to even script-kiddie hackers, like the 19-year-old in Canada who just used the vulnerability to steal 900 Canadians’ Social Insurance Numbers from the government. That alone should tell you just how insidious this discovery really is…

The ONLY Cure

So THE ONLY THING THAT WILL STOP THESE HACKS is REPLACING OpenSSL (which is free open source software) with something else. There are numerous alternative security protocols, however it IS NOT A SIMPLE SOLUTION! Which is why a recent analysis stated that 70% of the Internet was still vulnerable to this “bug”.
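
For operators trying to gauge their own exposure, one small practical check is which OpenSSL release their software is built against and running with: the 1.0.1 through 1.0.1f releases contain the Heartbleed flaw, while 1.0.1g and later do not. Below is a minimal C sketch using OpenSSL’s own version macros and the SSLeay_version() call; it is an illustrative check, not a full audit, since statically linked or bundled copies of the library also need to be tracked down.

```c
/* Minimal sketch: report the OpenSSL version this program was built
 * against and the version of the library actually loaded at runtime.
 * Build (on a system with OpenSSL headers):  gcc check_ssl.c -lcrypto */
#include <stdio.h>
#include <openssl/opensslv.h>   /* OPENSSL_VERSION_TEXT / _NUMBER macros */
#include <openssl/crypto.h>     /* SSLeay_version() in OpenSSL 1.0.x */

int main(void)
{
    /* Compile-time: the headers the code was built against. */
    printf("Built against : %s\n", OPENSSL_VERSION_TEXT);

    /* Run-time: the shared library actually loaded. */
    printf("Running with  : %s\n", SSLeay_version(SSLEAY_VERSION));

    /* Heartbleed affects the 1.0.1 series up to 1.0.1f; 1.0.1g is fixed.
     * OPENSSL_VERSION_NUMBER for 1.0.1g is 0x1000107fL. */
    if (OPENSSL_VERSION_NUMBER >= 0x10001000L &&
        OPENSSL_VERSION_NUMBER <  0x1000107fL)
        printf("Headers correspond to a Heartbleed-vulnerable release.\n");
    else
        printf("Headers are not in the known-vulnerable 1.0.1-1.0.1f range.\n");

    return 0;
}
```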

The Only way to stay safe until it’s completely replaced throughout the Internet is to REMOVE YOUR DATA AND CLOSE YOUR ACCOUNTS

Obviously the businesses are trying to stop mass hysteria by sending these misleading emails asking people to “change passwords” when they know full well that won’t do anything. They also will most likely tell you that THEY were not affected… Hogwash… If they’re on the Internet, the odds are they are using it, are vulnerable, and are scrambling to replace it everywhere… The problem is that it is so integrated into everything, which is why it’s being called
The Biggest Internet Threat in the History of the Internet

So don’t be fooled: no company, unless they’re smart, is going to be honest about how big an impact this discovery has had and how vulnerable it has left them, so use common sense and Protect Yourself… And right now there’s only one way to do that… Don’t use the Internet for secure transactions or place your critical data anywhere on the Internet, even behind a firewall or VPN tunnel; it’s still not safe (or could still be unsafe, depending on what is being used, but why risk it?).

Pull everything and wait six months. Then, when you feel confident, before selecting a new provider, have them show you their OpenSSL replacement plan and how they have resolved the issue; if they can’t provide that immediately, go somewhere else…

Red Hat Touts Virtual Containers as Next Advance in Cloud Computing

image

By Eric Lundquist
NEWS ANALYSIS: Red Hat says virtual containers will be an efficient way to distribute applications across the hybrid cloud infrastructures that are favored by enterprises.

The development of shipping containers enabled the global trade we now enjoy and made shipping your products around the world a simple transaction. As the opening paragraph in a book chronicling the rise of shipping containers (The Box) from the origins in 1956 onward stated, “The container made shipping cheap, and by doing so changed the shape of the world economy.”

Now containers, albeit in a digital form, are being touted as the next big thing in cloud computing.

At the Red Hat Conference April 14-17 in San Francisco (which I did not attend but did watch the keynotes on YouTube), containers were a big piece of the news. “This [container technology] is one of the things that is really going to drive the future,” said Red Hat President of Products and Technologies Paul Cormier in his keynote address.

In an era where applications will increasingly be sharing physical and virtual machines as well as private and public clouds, container technology—which allows for lightweight application transport between those myriad platforms—may be the only way to go.

The idea of virtual containers has been around for a while. Although I’m not sure if they go all the way back to those 1956 physical containers, Sun was talking about Solaris containers in the early 2000s.

On one hand, you can see containers as the continuing evolution of applications being separated from hardware. Server virtualization allowed for the operating system to be freed from a specific hardware server. Now containers promise the ability to disperse and move applications throughout the hybrid infrastructures, which seem to be the favorite of corporate techies.

Containers as the next evolution of virtualization is a favorite topic of, no surprise, the companies created around the container concept. In a blog post last September, Pantheon CEO Zack Rosen wrote, “We believe that the future cloud will run on containers, not virtual machines. Pantheon’s container based infrastructure is a huge departure from traditional virtual machine and server based ‘hosting’ model.”

Pantheon runs more than 55,000 Drupal content management system sites, which Rosen claimed would require 100,000 virtual servers without using the lighter, more flexible container approach.

One other big name in the container field is Docker, which describes itself as a shipping container for code. The Docker approach uses the Linux platform to allow containers to be isolated but share an operating system and libraries.
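
To make the “isolated but sharing an operating system” idea concrete, here is a minimal C sketch of the Linux kernel primitives (namespaces created via clone()) that container runtimes such as Docker build on. It is a bare illustration of the mechanism, not how Docker itself is implemented; a real runtime adds cgroups for resource limits, a layered root filesystem, networking and more.

```c
/* Minimal illustration of Linux namespace isolation, the kernel feature
 * container runtimes such as Docker build on.
 * Build and run (root needed to create namespaces):
 *   gcc ns_demo.c -o ns_demo && sudo ./ns_demo */
#define _GNU_SOURCE
#include <sched.h>      /* clone(), CLONE_NEW* flags */
#include <signal.h>     /* SIGCHLD */
#include <stdio.h>
#include <string.h>
#include <sys/wait.h>
#include <unistd.h>

static char child_stack[1024 * 1024];   /* stack for the cloned child */

static int child_main(void *arg)
{
    (void)arg;

    /* Inside its own UTS namespace the child can change the hostname
     * without affecting the host: one small slice of "container" behaviour. */
    sethostname("demo-container", strlen("demo-container"));

    /* Inside its own PID namespace this process sees itself as PID 1. */
    printf("child: my pid is %d, my hostname is now:\n", (int)getpid());

    /* A real runtime would set up a root filesystem, cgroups and networking,
     * then exec the application; here we just show the isolated hostname. */
    execlp("hostname", "hostname", (char *)NULL);
    perror("execlp");
    return 1;
}

int main(void)
{
    /* New UTS (hostname), PID and mount namespaces for the child; it still
     * shares the host kernel and libraries, unlike a virtual machine. */
    pid_t pid = clone(child_main, child_stack + sizeof(child_stack),
                      CLONE_NEWUTS | CLONE_NEWPID | CLONE_NEWNS | SIGCHLD,
                      NULL);
    if (pid == -1) {
        perror("clone");
        return 1;
    }
    waitpid(pid, NULL, 0);
    return 0;
}
```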

It was the sharing capability and the relative infancy of containers that initially raised concerns about security, privacy and compliance in the corporate environment. The same concerns were raised about public cloud computing and virtualization in their early days.

But as those concerns have been overcome in the public cloud environment, they are also being overcome in the container model. Oracle, the epitome of the enterprise software vendors, has been touting its pluggable 12c database based on the container model.

At its customer conference on April 15, Red Hat introduced Project Atomic, designed to create a user community that develops technologies for lightweight Linux container hosts. The company is also aligning with Docker to provide the container format used in its OpenShift platform for service development.

Red Hat isn’t alone in talking up the container trend as the next evolution of virtualization. With corporate techies still wary of being locked into proprietary development platforms—whether on premises or offered up as a platform as a service—the container concept addresses those concerns as well as the difficulty of managing applications that require a full virtual machine to be deployed and redeployed depending on their hosting location.

Just when you thought you might be getting tired of talking about IaaS, PaaS and SaaS, now you can start shifting your cloud conversation to CaaS—containers as a service.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

DOBSON WINS TEMPORARY OBAMACARE EXEMPTION

image

“Christian radio broadcaster James Dobson has won a temporary injunction preventing the federal government from requiring his ministry to include the morning-after pill and other emergency contraception in its health insurance. A federal judge in Denver issued the injunction Thursday.

Dobson sued in December, saying the Affordable Care Act mandate to provide the contraception violates the religious beliefs of his Colorado Springs-based ministry, called Family Talk.

The U.S. Supreme Court is considering similar challenges from Hobby Lobby and other employers. Dobson is founder and president of Family Talk, which has a nationally syndicated radio show, newsletter and website. The lawsuit says the ministry has 28 full-time employees.

He’s best known as founder of the conservative Focus on the Family ministry. He left that group and launched Family Talk in 2010.”

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Mateo-based marketing performance management startup Beckon has raised $8 million

image

San Mateo-based marketing performance management startup Beckon has raised $8 million in Series A funding from August Capital and Canaan Partners.

Beckon helps marketers track spending using data analytics by consolidating all data and insights from different sources.

Founded in 2011, Beckon has raised $10 million to date and will use the new funds to scale its sales and marketing teams.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

3D Printing: How MakerBot, Thingiverse and OpenSCAD want to change every-Thing

image

Here on 3DPI, Bre Pettis, MakerBot’s CEO, does not really need an introduction: for one thing we’ve interviewed him before, and for another he’s probably the most famous 3D printing personality in the industry, and certainly has the most recognisable face. He is also one of the great advocates for this technology sector and represents one of the most dominant 3D printing brands. As one of the original founders of MakerBot, maker of the leading consumer/desktop-based (or, as I like to call them, personal) 3D printers, which has in turn become the most important financial bet for Stratasys, he is a busy man. The world’s second largest industrial 3D printer producer bet $400 million that Pettis is right, that MakerBot machines are going to change the face of personal manufacturing and the way the entire world "thinks about things." We sure hope so. We had a chance to meet him during Milan Design Week, where he came to see how 3D printing is changing the world of design. He visited us during our Synthesis exhibit, and I had the chance to ask him about his near-term plans for MakerBot and Thingiverse, which is rapidly shaping up to be a social network with millions of monthly users.

Davide Sher: How are the new MakerBot Replicators coming along?

Bre Pettis: We just launched our fifth generation Replicator for the prosumer area but this time we are focusing more on the consumer, with the Replicator Mini, and trying to target the higher end industrial side with the Z18. I think design professionals will appreciate the new Replicator, a machine they can just keep on their desk, while consumers will like the possibility of carrying around the Mini with relative ease. The Z18 offers serious capabilities at an accessible price point so I think the game is on.

Davide Sher: MakerBot just signed a new agreement for its first official distributor in Italy, Energy Group. What is going to change?

Bre Pettis: What it means is that now there will be someone local that will help both us and our customers. For an American company it is sometime difficult to do business overseas, as people order a product but are not fully aware of import duties that affect the final prices. Having an official distributor on the territory will streamline the entire process and give us support in terms of customer and technical assistance.

Davide Sher: Is it correct to say that you now represent Stratasys’ consumer business?

Bre Pettis: Yes in a way it is correct and it is a little bit ironic: we started MakerBot in 2007 because we could not afford Stratasys’ 3D printers, and now we are part of the family.

Davide Sher:  Does MakerBot ever intend to explore other technologies along with FDM?

Bre Pettis: Technologies such as stereolithography (SL) and laser sintering make sense on the industrial side but our goal is to produce machines that consumers can use right in their living room, that can sit on their desks and do not release odours and chemicals. SL is an awesome technology and the XFab by DWS [DS: the new prosumer SL 3D printer we were presenting during the Synthesis event] is a hot machine but we want machines that are fully accessible, even to kids. There are going to be a lot of different 3D printers on the market and there is room for everyone. What I love to see is people and companies inventing new things, not just imitating and following.

Davide Sher:  Let’s move on to Thingiverse. Is it really going to be the “Thingiverse of the world” and how do you intend to organize its growing amount of content?

Bre Pettis: There are definitely some issues we need to solve. We started Thingiverse as a place where people could share their digital designs of things. We knew people could download music, television, books… and we wanted people to download things. In the beginning we had a laser cutter: we mostly published laser cut designs and everyone who participated was incredibly creative. Now it is different: we have millions of viewers and tens of thousands of new designs every month, so a few things have happened.

For example, we recently had the first month in which more things were collected than created, meaning more people were organizing existing things than uploading new ones. People are contributing to organizing the website’s database, and that is a positive development.

One of the biggest challenges is – I must admit – that I am starting to fail at going through every single new upload. Up until a year ago I would look at every new thing on Thingiverse every day. It would take me about 20-40 minutes, but now there are so many new things uploaded every day that often I just do not have the extra hour I would need to look at everything.

So we have implemented certain algorithms that allow the cooler objects to show up more but it is a bit like Twitter. In the beginning you could “hear” everyone on it. Now it is just too huge.

Davide Sher:  Are you going to further develop the social aspect of it?

Bre Pettis: We have to. Otherwise you will just not be able to see what is cool. We have to help each other find the best objects to print or it will be like drinking straight from the hose… too much guzzling. Our biggest challenge is to make it so that everyone can find the things that interest him or her. We want to implement features that will allow users to follow specific designers and have their creations show up automatically in their feed. Also we want to add features for following and contributing to discussions. These are awesome challenges and we love to work on them.

Davide Sher:  How many people work on Thingiverse exactly?

Bre Pettis: Developers can do amazing things, but finding good web developers is not easy. Right now the team is on the order of a few dozen people. Everyone works on everything: the team collaborates on the design and on the apps. Everyone involved is a Thingiverse power user, so when we launch something we try to make it the best it can possibly be.

I would love to be able to grow the development team into the hundreds, but Thingiverse is tricky because we do not make any money on it, we just spend on it and what we get in return is a lot of excitement because there are literally tons of objects being downloaded all the time. For us it is a “labour of love”.

Davide Sher: That brings us to the “big” question. Is there an ethic to what can be uploaded or do you just try to give as much freedom as possible?

Bre Pettis: We want Thingiverse to survive, so from day one we were DMCA compliant. Basically we work like YouTube: if somebody complains we take it down. There is a very specific way in which they have to formulate their request and we only take it down if the precise legal format is followed. At this point the original person who uploaded it can do the same to claim ownership of the design and ask for it to be reposted. So we have to put it back online, and after that any further request has to be handled by a court. It is a very standard procedure.

Davide Sher: What do you think of OpenSCAD as a 3D modelling software?

Bre Pettis: The “father” of OpenSCAD is Marius Kintel, a Scandinavian who lived in Austria and is now in Toronto. He is one of the smartest people in the world and a wonderful person. When he took over the OpenSCAD project he rewrote every single line to optimize it and we customized Thingiverse on top of it. In fact we are great friends: he worked with me on building my first 3D printer. Now as MakerBot we are supporting OpenSCAD development: for programmers and web developers it is the ultimate 3D modelling tool – open solid CAD modelling – and adding customizers on top of it makes it so that you can program a Christmas tree and give amateur end users the tools to change it and personalize it through sliders. The relationship between us and OpenSCAD is a friendship that is going to last a long time.

Davide Sher: Thanks so much for talking with us Bre, we appreciate your time.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based primary care medical practice One Medical Group has raised $40 million

image

San Francisco-based primary care medical practice One Medical Group has raised $40 million in growth capital led by Redmile Group with participation from previous investors, which include Benchmark, DAG Ventures, Maverick Capital, Oak Investment Partners, and Google Ventures.

One Medical is using technology to build a better primary care delivery program, allowing patients to schedule appointments, request prescriptions and lab results, and see a personal health summary online.

Founded in 2007, One Medical has 27 locations nationwide and will put the new funds toward expansion to new cities and mobile product development.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

19-Year-Old Canadian Arrested for using Heartbleed Hack to steal Social Security numbers from 900 Canadians

image

A 19-year-old Canadian was arrested on Tuesday for his alleged role in the breach of the Canada Revenue Agency (CRA) website, the first known arrest for exploiting the Heartbleed bug.

Stephen Arthuro Solis-Reyes (pictured) of London, Ontario faces one count of Unauthorized Use of Computer and one count of Mischief in Relation to Data.

image

On Monday, CRA Commissioner Andrew Treusch announced that over the course of six hours, the Social Insurance Numbers of about 900 taxpayers were removed from CRA systems. The hack occurred only a day after CRA services were fully restored, following last week’s temporary shutdown due to the Heartbleed bug.

“The RCMP treated this breach of security as a high priority case and mobilized the necessary resources to resolve the matter as quickly as possible,” Assistant Commissioner Gilles Michaud said in a statement.

A search of the suspect’s home led to the seizure of computer equipment. Police provided no further details about the ongoing investigation.

Solis-Reyes is scheduled to appear in court in Ottawa on July 17.

Uncovered early last week by a team of researchers from Google Security and Codenomicon, the Heartbleed weakness has been roaming the Internet for two years, leaving the door to encrypted data and personal information wide open to scammers.

Now, Web-based organizations are scrambling to patch their systems before they become the next Canada Revenue Agency.

Those 900 residents whose data was compromised can expect a registered letter informing them that they’ve been impacted; for added security, the agency will not be making phone calls or sending emails.

It will, however, provide the affected users with free access to credit protection services and will apply additional protections to their CRA accounts to prevent future unauthorized activity.

For more, see PCMag’s Heartbleed: The Complete Rundown. Also check out Heartbleed: How It Works and Heartbleed Bug: Should You Panic?

Also watch PCMag Live in the video below, which discusses the arrest of the 19-year-old hacker.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Scam by Victoria’s Secret clerk highlights the common risk of “credit-card skimmers”: the identity-theft threat most of us face routinely!

image

Headlines and the attention of IT professionals have been dominated by Heartbleed recently, yet it’s a news story out of Florida that reminds us of an all-too-common identity-theft threat that most of us face on a routine basis: credit-card skimmers.

image

From an Orlando Sun report:

    Between Nov. 29 and April 3, the (Victoria’s Secret clerk) hid the skimmer under her skirt at Orlando Premium Outlets and swiped customers’ cards before running them through the cash register, according to court documents.

    The woman, whose name is not revealed in court documents, was paid $500 whenever a felon named Alexander Sundeman Sanchez downloaded card numbers from the device (once a week), records show.

   

“I forgot to tell u i really only want foreigners and tourists,”

Sanchez texted the woman, according to court documents.

Three details struck me: the targeting of victims less likely to contact law enforcement; that $500 a week is plenty of temptation for a retail-store clerk with a criminal disposition; and that the scam went undetected for months.

That’s one clerk in one store. Now think of how many times you hand your card to a waiter or insert it into a gas pump or ATM that might also carry a skimmer.

And skimmers – even the fake ATM kind – are not hard for criminals to come by, according to security expert Brian Krebs.

The Secret Service’s website has a description of the more popular skimmer scams and offers this advice for avoiding them:

    Ensure your credit card is swiped only once at a register.
    Conceal your PIN as you enter it into an ATM or credit card reader.

In other words, pay cash.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

RHEL 7 brings forth Project Atomic, kernel improvements, deeper Windows Active Directory integration and more.

image

SAN FRANCISCO — Red Hat Enterprise Linux 7 includes a lot of new features that have IT pros poring over documentation and sending recaps to their higher-ups, but Red Hat’s goal is to make RHEL boring.

“It’s the opposite of OpenStack, where new releases come to market every six months,” said Brian Stevens, CTO of open-source OS developer Red Hat Inc. “Only hypercritical changes merit new version numbers. Otherwise, you’d drive IT guys crazy updating their deployments.”

While companies such as Microsoft have promised faster operating system (OS) updates, Red Hat has slowed the release cycle for its flagship OS distribution to lower operational costs. Red Hat Enterprise Linux (RHEL) development targets a consumable OS and simplicity.

“Users’ costs aren’t just the OS cost,” Stevens said. “It’s all the costs of configuration, management and provisioning that far outweigh technology replacements.”

RHEL 7 contains its excitement

The RHEL 7 release includes kernel enhancements that map natively to the hardware, such as non-uniform memory architecture mapping and support for up to 5,120 logical CPUs and 64TB of physical memory. Other enhancements simplify upgrades from earlier versions of RHEL and help format the OS for the workload, whether a database server, Web server or other use.

“I’m skipping 6.5 and upgrading straight from RHEL 6.4 to 7,” said Stephen Eaton, a Linux systems administrator for DealerTrack Technologies, Inc., an automotive industry software provider. “It has the features I was looking for.”

One such feature is Ksplice, which allows administrators to patch the kernel without rebooting servers. Eaton’s environment is 24/7, so scheduling downtime is a big deal. He also likes the security and systems management improvements, and the ease-of-use improvements to SELinux.

RHEL has also grown more compatible with Windows OS, as evidenced by Active Directory interoperability in RHEL 7 and more approachable management tools. The question used to be Linux or Windows on servers; now Linux and Windows OSes coexist in the data center, according to the company.

“We will be able to sync Windows domain controllers with RHEL 7 for easier identity management,” said Eaton, whose shop uses Windows and Linux on a mix of virtualized and physical servers.

Project Atomic to create RHEL variant

Red Hat debuted a new community project to develop technologies for creating lightweight Linux Container hosts. Project Atomic will allow creation of a new variant of RHEL — Red Hat Enterprise Linux Atomic Host — as part of RHEL 7.

RHEL 7 also abstracts and isolates applications by deploying them in containers with RHEL Atomic Host. It has strong integration with Docker, which allows applications to be packaged in isolated containers.

RHEL 7 containers keep applications from fighting over resources, over which version of Java to use, or over other factors. The application takes as much of the OS as it needs with it, so it can move around and perform equally on bare metal, virtualized servers, and private and public cloud infrastructures.

North and southbound APIs let the OS interact with the host infrastructure and the application, while providing security and management services. Data centers can virtualize the OS and run the applications in containers atop the virtualization layer to improve standardization in a typical mixed data center, Stevens explained.

Attendees at the Red Hat Summit here this week need some time to digest this Linux container methodology. At the RHEL roadmap session, the majority of attendees said they either see Linux containers as at least a year away from meaningful adoption in their infrastructure or couldn’t envision a use case.

“Containers have been around for some time,” said Sander van Vugt, Linux consultant and trainer, who writes for SearchDataCenter. At first sight, containers as the major new thing in RHEL 7 was surprising, he said. “Considering that in RHEL 7 containers are combined with Docker, systemd and cgroups, it … is a big step forward for easy deployment of applications,” on cloud or straight RHEL systems, he said.

Red Hat will update RHEL Atomic Host alongside Red Hat Enterprise Virtualization and Red Hat OpenStack for application consistency across all hosting infrastructures. Project Atomic will feed RHEL Atomic Host.

The RHEL 7 release candidate goes live this week, with a final release sometime thereafter. The pricing structure will be similar to previous versions, but pricing was not provided.

Red Hat’s roadmap for Fedora, CentOS

Red Hat continues to push new features and concepts into Fedora and filters the packages that pass muster into new versions and point releases of RHEL. The company also had its largest beta testing community to date — 10,000 RHEL users — for version 7.

Red Hat also flipped the relationship between the CentOS operating system and RHEL this year. CentOS is an OS for big data and software-defined networking, Stevens said, and its end users don’t need or want the same kind of support that goes to RHEL users.

CentOS development now runs ahead of RHEL rather than trailing the Linux distribution, giving Red Hat more feedback to parlay into new RHEL editions, and it also increases the cloud-friendly, OpenStack nature of RHEL over time.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Mountain View-based real estate crowdfunding platform RealtyShares has raised $1.9 million

Mountain View-based real estate crowdfunding platform RealtyShares has raised $1.9 million in new funding led by General Catalyst Partners.

RealtyShares provides a platform for crowdfunding real-estate development, allowing developers to bypass banks and larger investors to quickly fund projects.

Launched last year, RealtyShares has already helped fund 26 projects in eight U.S. states and will use the new funds to double its team and expand its service.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Palo Alto-based personal health assistant service Better has raised $5 million

image

Palo Alto-based personal health assistant service Better has raised $5 million in seed funding from Social Capital Partnership and The Mayo Clinic.

Better delivers medical information to users via iOS app, connecting users with a personal health assistant if necessary to quickly answer questions or schedule medical appointments.

Founded last year by Geoff Clapp and Chamath Palihapitiya, Better is debuting today with a free basic service and access to a personal health assistant for around $50 per month.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based urban storage startup Boxbee has raised $2.3 million

image

San Francisco-based urban storage startup Boxbee has raised $2.3 million in new funding from Floodgate Capital, Google Ventures, 500 Startups, Techstars, Jason Calacanis, and Ludlow Ventures.

Boxbee delivers boxes for customers to fill with items for storage, transports them to a facility, and catalogues each box for easy retrieval.

A graduate of AngelPad last year, Boxbee is currently available in San Francisco and New York and will use the new funds to scale the service.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Transcendence – Johnny Depp – Opening April 17th Everywhere!

image

Transcendence

(2014) PG-13, 119 min
Drama | Mystery | Sci-Fi | Thriller

As Dr. Will Caster works toward his goal of creating an omniscient, sentient machine, a radical anti-technology organization fights to prevent him from establishing a world where computers can transcend the abilities of the human brain.

Director(s): Wally Pfister
Star(s): Johnny Depp
Rebecca Hall
Morgan Freeman 
Cillian Murphy  

Find Showtimes & Tickets

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Payment sector embraces cutting-edge technology, including electronic currency

image

Now a victim of its own omnipresent success, the global electronic payments industry is increasingly turning to new technologies as it looks to expand its footprint and find new ways to make money by getting consumers to spend theirs.

The pace of technological advancement in the payments market has even caused regulators to take notice, with innovations like cryptocurrencies continuing to grab headlines and attract government scrutiny at the federal and state level. Continue reading…

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Of the 100 largest venture capital rounds on record, 88 were issued within the past five years

image

Companies are staying private longer and investors are eager to write big checks. As a result, enormous venture capital rounds are becoming common. Of the 100 largest venture capital rounds on record, 88 were issued within the past five years, according to CrunchBase, which tracks venture funding. Each delivered more than $50 million to the companies.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Server makers rushing out Heartbleed patches

image

Enterprise IT vendors are rushing to protect users from the Heartbleed bug, which has been found in some servers and networking gear and could allow attackers to steal critical data — including passwords and encryption keys — from the memories of exposed systems.

Hewlett-Packard, Dell and IBM have set up pages that identify hardware and software products affected by Heartbleed, which exposes a critical defect in certain versions of OpenSSL, a software library for secure communication over the Internet and networks.

The bug, which was detailed last week, has already been patched in a new version of OpenSSL, but hardware companies are now racing to patch products that rely on older versions. Firmware and software patches have been issued for HP’s BladeSystems, IBM’s AIX servers, and Dell’s appliances and networking equipment. In advisories, the server makers have urged customers to investigate hypervisors, OSes and middleware for possible vulnerabilities.
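
For administrators who want a first read on their own exposure while waiting on vendor advisories, a quick check of the OpenSSL build on a Linux host is a reasonable starting point. The commands below are a minimal sketch for RPM- and Debian-based systems; OpenSSL 1.0.1 through 1.0.1f are vulnerable unless the distribution has backported the fix for CVE-2014-0160, so the package changelog is more reliable than the version string alone.

# Show the installed OpenSSL version; 1.0.1 through 1.0.1f are vulnerable unless patched
openssl version -a
# On RPM-based systems, check whether the package changelog mentions the Heartbleed CVE
rpm -q --changelog openssl | grep CVE-2014-0160
# On Debian/Ubuntu systems, list the installed OpenSSL packages
dpkg -l openssl libssl1.0.0
# Apply vendor updates, then restart any services that link against OpenSSL
yum update openssl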

Some HP servers use OpenSSL for encryption and secure communication, and the company is conducting an “aggressive and comprehensive review of all actively supported products” for exposure to the Heartbleed bug, an HP support page said. The security updates are available for free to all customers, an HP spokesman said in an email on Monday.

HP on Sunday issued patches for some versions of server management tools BladeSystem c-Class Onboard Administrator, Smart Update Manager and the System Management Homepage running OpenSSL on Linux and Windows.

HP last week said it had not yet identified networking equipment affected by Heartbleed, but would continue investigating products.

Dell’s PowerEdge servers and OpenManage system management products are not likely affected by Heartbleed. But in a comprehensive Heartbleed advisory, Dell identified system management, security appliances and networking equipment affected by the bug.

Dell is working on patches for the Kace K3000 mobile-device management appliance, some Foglight network appliances and networking equipment running on Dell’s Networking Operating System (FTOS). The company has already issued firmware patches for affected SonicWall security appliances, and the advisory page on Dell’s website will be updated when fixes for more products are released.

IBM has found the Heartbleed bug affecting AIX servers, which use OpenSSL to implement communication across clusters via the TLS (Transport Layer Security) protocol. OpenSSL also enables SSL (Secure Sockets Layer) for secure communication over the Internet.

IBM has issued an OpenSSL patch for servers that shipped with the AIX 6.1 OS at the TL9 level and AIX 7.1 at the TL3 level. IBM is also recommending upgrading to the new OpenSSL version on GPFS (General Parallel File System) versions 3.4 and 3.5 for AIX and for Linux on Power and x86 servers. Software including WebSphere MQ, Sametime Community Server version 9 HF1 and Cloudant is also affected by the Heartbleed bug.

IBM in an advisory suggested System Z server customers subscribe to the System z Security Portal for the latest patches and software updates.

 


By: Jarrett Neil Ridlinghafer
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

 

New York-based personal finance education startup LearnVest has raised $28 million

image

New York-based personal finance education startup LearnVest has raised $28 million in a Series D round led by Northwestern Mutual Capital, with Accel Partners and American Express Ventures participating.

LearnVest offers a seven-step plan that helps users cut expenses, budget for goals, and invest their money with the assistance of a certified financial planner.

Having debuted at TechCrunch50 in 2009, LearnVest has raised over $72 million to date and will put the new funds toward hiring and scaling its LearnVest at Work product.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

New York-based online investment company Betterment has raised $32 million

image

New York-based online investment company Betterment has raised $32 million in a Series C round co-led by Citi Ventures, Globespan Capital Partners, and Northwestern Mutual, with participation from previous investors Bessemer Venture Partners, Menlo Ventures, and Anthemis Group.

Betterment currently serves 30,000 customers with its web-based money management platform that automates the role of traditional wealth advisors, and has $500 million in assets under management.

Founded in 2010, Betterment has raised $45 million to date and will use the new investment to expand its product offering.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Seattle-based online legal advice marketplace Avvo has raised $17.5 million

image

Seattle-based online legal advice marketplace Avvo has raised $17.5 million in a Series D round led by Coatue Management with participation from previous investors Benchmark, Ignition Partners, and DAG Ventures.

Avvo is a legal Q&A forum, directory, and legal marketplace that connects hundreds of thousands of consumers with lawyers each month.

Launched in 2007, Avvo has raised $60.5 million in funding to date and will use the latest cash to develop its tools for the legal space and expand its reach globally.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

JVP Leads $7 Million Round in GreenSQL

image

TEL AVIV, Israel, April 14, 2014 /PRNewswire/ —

GreenSQL, a leader in unified database security and compliance, announced today that Jerusalem Venture Partners (JVP), Israel’s leading venture firm in the cyber-security sector, would be investing in the company with participation from all its existing investors, including Magma Venture Capital, Rhodium, Atlantic Capital Partners, Gandyr and 2Bangels. Proceeds from the round will be used to support international business expansion and the growing needs of enterprise clients to secure their database applications as they migrate to the cloud.

Founded in 2009, GreenSQL is a software-based solution installed as a frontend to the database layer, fully camouflaging and securing the database while eliminating database vulnerabilities in the face of modern-day cyber-attacks. GreenSQL has over 150,000 downloads worldwide and provides protection from internal and external threats in real time. The software is the most popular data security and compliance solution worldwide and prevents SQL injection attacks (today’s number one data breach method) at the database layer while enforcing the deepest separation of duties (SOD) for the most common databases. GreenSQL solutions are easy to install, configure, manage and maintain, and can be up and running within an organization in a matter of hours.

“This strategic partnership with JVP will allow GreenSQL to expand its activities globally and capitalize on our leading position in securing database deployments over cloud platforms like Amazon RDS and Microsoft Azure,” said Amir Sadeh, CEO of GreenSQL. “Our low-touch solutions allow organizations to overcome their top database security and compliance concerns by complying with the latest database regulations requirements, blocking unauthorized database access, and auditing the activities of authorized users. Our solutions can be applied to databases at the corporate or department level, and deployed on-premise or over the cloud.”

JVP’s investment in GreenSQL reinforces its emphasis on identifying leading Israeli cyber-security firms with strong technological advantages and immense growth potential. JVP is the lead investor in several of Israel’s top cyber-security companies including CyberArk, Nativeflow, ThetaRay, CyActive and operates the JVP Cyber Labs incubator in Beer Sheva.

“GreenSQL is at the cutting-edge of database and cloud security and is providing a product that fills a major market need,” said JVP General Partner Gadi Tirosh. “We are thrilled to be adding such a strong company to our cyber portfolio and see this company’s success and ongoing potential as a further testament to the strength of the Israeli cyber-security sector and the innovation that drives it.”

About GreenSQL:

GreenSQL provides unified database security and compliance solutions for enterprise companies utilizing both on-premises infrastructure and cloud architectures. With an all-in-one approach to database security, the GreenSQL product family helps organizations meet their regulatory compliance requirements by providing real-time alerts and compliance reports, while also securing sensitive information from unauthorized database access. The GreenSQL solution provides database security, compliance and dynamic data masking in a single package. As the most popular database security and compliance solution, with over 150,000 copies downloaded in more than 198 countries, GreenSQL is committed to delivering total database security and compliance solutions that are easy to deploy and use.

(http://www.greensql.com)

About Jerusalem Venture Partners:

Jerusalem Venture Partners (“JVP”) is a prominent venture capital fund based in Jerusalem, Israel, with offices in Beer Sheva, New York and Paris. Established in late 1993, JVP has raised close to $1 billion through nine VC funds and was recently ranked by Preqin as one of the top-ten consistently performing VC firms worldwide based on IRR and net returns. JVP invests in Israeli companies and technologies in the areas of digital media, cyber-security and storage. The fund has invested in close to 100 companies. With over 26 noteworthy exits to date, 13 of which were through NASDAQ IPOs, JVP has led some of the largest exits out of Israel. JVP is one of Israel’s leading venture capital firms focusing on cyber-security investments, having built up such cyber-security companies as CyberArk, Navajo (acquired by SalesForce), Magnifire (acquired by F5 Networks), ThetaRay and NativeFlow. (http://www.jvpvc.com)

About Magma Venture Capital

Magma Venture Capital is a leading Israeli venture capital firm specializing in early-stage investments in communication, semiconductors, internet and new media. Managing Partners Yahal Zilka and Modi Rosen founded Magma in 1999. Since then the firm has sought out innovative companies, and has guided many to successful exits, such as Waze, Provigent, Wintegra, Trivnet, and Phonetic Systems. Our goal is to foster and enable the flow of innovation from an idea’s early stages and through a company’s emergence as a leader within its dynamic market. (http://www.magmavc.com)
About Rhodium

Rhodium invests in early-stage ventures in Israel, New York, and Silicon Valley. Rhodium focuses on identifying and partnering with the very best and most promising entrepreneurs and innovators with a view to building disruptive, world-changing companies in the fields of advertising, social, mobile, content, commerce and other exceptional technologies. Rhodium’s portfolio includes companies such as Outbrain, face.com (acquired by Facebook), Hopstop (acquired by Apple), YieldMo, Yotpo and ZooZ.

(http://www.rhodium.co.il)

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Layered Tech Selects Kurt Hoffman as Chief Operating Officer

image

PLANO, Texas, April 15, 2014 /PRNewswire/ — Layered Tech, a leading global provider of secure and compliant cloud and hosting services, announced that Kurt Hoffman has joined the executive team as its Chief Operating Officer (COO).

As Layered Tech COO, Hoffman will oversee all operations and technology groups within the company, including IT architecture and service delivery, compliance, product management, software development, and customer support. Hoffman will report directly to Layered Tech CEO Bruce Chatterley.

Hoffman comes to Layered Tech with more than 30 years of technology and operations experience and a proven ability to lead high-performance teams in high-growth environments. In his most recent position he served as COO for Hawaiian Telcom, overseeing day-to-day technology, product development, and sales and marketing operations for Hawaii’s leading provider of integrated communications solutions. Prior to that, Hoffman was COO of Speakeasy, playing a leading role during a period of sustained strong growth for the broadband and hosting services provider, including its acquisition by Best Buy in 2007.

Hoffman also brings significant international experience to his new role at Layered Tech, having previously served as Senior Vice President for Network Deployment and Operations for Level 3 Communications. In this position, he managed Level 3’s build out of its fiber optic network across Europe. His international experience also includes his tenure as Sprint/Global One’s Vice President of Global Operations, based in Belgium.

“Kurt Hoffman is an exceptional operating executive and we are excited to have him join the Layered Tech team,” said Chatterley. “Working with our talented operations team, Kurt will focus on delivering the best customer experience in the industry.  Kurt’s deep experience operating in high-growth environments will be critical in maintaining and exceeding our high expectations for quality, as we continue to accelerate Layered Tech’s next phase of growth.”

Hoffman is enthusiastic about his new role, saying, “Layered Tech has created a tremendous platform for growth. The company delivers valuable services focused on today’s critical IT business needs. Layered Tech’s industry-leading offerings in Managed Security, Managed Hosting, and secure Cloud-based applications have allowed it to form partnerships with an impressive list of blue chip customers. I look forward to working with the Layered Tech team to both build on service quality and enhance the customer experience as we embark on our next phase of growth.”

About Layered Tech

Layered Technologies (Layered Tech), a leading global provider of compliant and secure cloud and hosting services offers PCI-, HIPAA- and FISMA-compliant hosting solutions, managed dedicated hosting and cloud computing services, including Compliance Guaranteed, which ensures that all Layered Tech compliance services are guaranteed to pass 100 percent of every IT audit or assessment. By providing high-quality technology, infrastructure and support, Layered Tech enables clients to eliminate capital expenses and save on operating costs so they can focus on core initiatives. Layered Tech’s scalable infrastructure powers millions of sites and Internet-enabled applications, including e-commerce and SaaS solutions. Clients include federal, state and local government agencies; large enterprises with advanced data security, compliance and uptime requirements; and leading-edge Web 2.0 startups. For more information, visit http://www.layeredtech.com .

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Access Selects ClearDATA for HIPAA Compliant Cloud Hosting, Backup, and Disaster Recovery

image

PHOENIX, AZ, Apr 15, 2014 (Marketwired via COMTEX) — ClearDATA Networks, Inc., the leading healthcare cloud computing platform and service provider, today announced that Access, a leading provider of enterprise e-forms management and technology solutions for the healthcare industry, has selected ClearDATA’s HealthDATA(TM) Cloud Computing Platform to host its suite of software solutions.

Under terms of the agreement, Access will offer ClearDATA’s HIPAA compliant hosting, offsite backup, disaster recovery, and information security services to its new and existing customers in order to help them reduce capital expenses, increase productivity, and meet HIPAA security and privacy requirements.

“Hospitals and healthcare organizations are moving towards going paperless to reduce costs and improve productivity, which are the same reasons they are moving to the cloud,” said Mark Johnson, president of Access. “Access is pleased to partner with ClearDATA for cloud hosting services that will benefit our customers while also meeting the highest standards for security, patient privacy, and HIPAA compliance. ClearDATA’s technical expertise, customer satisfaction, and track record with hosting healthcare organizations is unmatched in the industry, and we want nothing but the best for our customers.”

“Access provides vital services for healthcare organizations by helping them create and manage Web-based e-forms that meet HIPAA requirements and integrate with leading healthcare systems and applications,” said Darin Brannan, president of ClearDATA. “ClearDATA is excited to partner with Access to provide their customers with cloud computing services that help them reduce costs, improve productivity, and meet stringent security and privacy requirements.”

About Access

Hundreds of hospitals worldwide use paperless Access solutions to integrate e-forms, electronic patient signatures and clinical data into EHRs. Access helps improve care, eliminate financial and environmental costs and enhance patient safety and downtime planning initiatives. Learn more at http://www.accessefm.com and discover how you can help Access’s partner The Last Well bring fresh water & the Gospel to Liberia. For more information, visit: http://www.accessefm.com/ .

About ClearDATA

ClearDATA is the market leader for healthcare cloud computing and information security for healthcare providers and technology companies, and is 100% dedicated to the healthcare field. ClearDATA’s HealthDATA Suite of Cloud Computing Solutions enables providers to fully automate, protect, and securely manage healthcare medical records, applications, IT infrastructure, and digital storage. The company provides HITECH HIPAA-compliant cloud hosting of infrastructure with managed services, offsite backup and disaster recovery, medical image backup and VNA, healthcare I-PaaS, healthcare information security, and world-class delivery and support.

ClearDATA is serving the $35 billion Healthcare IT Market, which is growing at an estimated 20% annually. This industry includes thousands of healthcare providers who must migrate to electronic records and maintain vast amounts of data, which currently exist mostly on paper charts, film or tape. These healthcare providers need storage and processing solutions that can be securely accessed and analyzed anywhere, and at any time.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Peoples Communication uses Skitter to push IPTV into East Texas

image

Peoples Telephone Cooperative (PTC) subsidiary Peoples Communication (PCI), in the midst of a 600-mile fiber transport network expansion in East Texas, has chosen Skitter TV’s advanced IPTV platform as the way it plans to offer a mix of television services.

Skitter’s selection pretty much goes hand-in-hand with the capabilities now available to PCI via a more potent fiber network, PCI CEO Scott Thompson said in a press release.

“Over the past few years, PCI operated different TV services, including a traditional cable system, but these solutions were not economically viable in our markets. We needed a video solution that not only provided our customers with the best technology and programming but also offered PCI a path to profitability.”

That solution turned out to be Skitter, which will deliver a package of television programming that includes the most popular cable and local broadcast channels “in packages that will offer customers a greater selection of standard and high definition programming.”

Skitter’s business case is to offer a turnkey video service that includes encoding, managing, streaming and viewing converged Internet TV, VOD and live broadcast and satellite TV.

Peoples Telephone Cooperative, in addition to offering cable TV, is using the network to deliver broadband, telephone and 4G wireless. PCI will also use Skitter for Connextions Telecom, its competitive local exchange carrier (CLEC).

For more:
– see this press release

Related articles:
At TelcoTV, vendors urge integration with existing IPTV services
SkitterTV offers streaming live TV in Portland, Ore.
SkitterTV opts for Amino’s Aminet hybrid STB
Skitter scores pair of IPTV wins with Pennsylvania Tier 3 operator Venus Telephone

Read more about: Peoples Communication, PCI

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Shaw Communications will lay off 400 management and non-customer facing employees

image

Shaw Communications will lay off 400 management and non-customer facing employees and hire up to 100 others as part of a companywide restructuring that will, the MVPD hopes, “improve overall efficiency while enhancing its ability to grow as the leading content and network experience company.”

Aside from the layoffs, Shaw will consolidate the operation of its residential cable, satellite, Internet and home phone services under a consumer business unit umbrella. Enterprise services like cable, satellite and tracking will be integrated into Shaw’s media business and managed as a standalone unit.

It is time for a change, said CEO Brad Shaw, noting that the company, like almost every MVPD, is no longer a traditional cable TV provider.

“The roles and structure we established years ago to support us as a cable company can no longer support our growth,” Shaw said in a press release. “We are eliminating duplication of work and organizing our activities and operations in a way that best meets the needs of our customers and viewers.”

Shaw is targeting needs in procurement, supply chain, marketing, pricing, network architecture and next generation products for the 100 or so new workers it will bring aboard.

The restructuring comes on the heels of a second quarter where Shaw lost 20,758 cable TV customers but gained 12,767 high-speed data customers. Its cable division reported 3 percent more year-over-year revenue (US $839 million versus $764.6 million) in the quarter.

For more:
– see this press release

Related articles:
Shaw revenues up even as cable subscriber numbers continue to fall
Shaw drops 29,522 video subscribers in fiscal Q4
Canada to propose forcing a la carte programming
Shaw to acquire business ISP ENMAX Envision

Read more about: Canada, layoffs

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Mediacom hikes broadband speeds to 305 Mbps to compete with CenturyLink

image

Facing competition from telco CenturyLink (NYSE: CTL) in Cedar Rapids, Iowa, Mediacom is reportedly looking to raise its broadband speeds to 305 Mbps via what it calls the Ultra 305 level of service.

For a reported $199.95, residential customers will get almost triple the speed of the MSO’s previous fastest broadband service, Ultra 105. They’ll also get a 4 terabyte data usage allowance before being hit with overage fees, a company spokesman told Multichannel News.

Attaching data limits is nothing new for the MSO. Ultra Plus 105 customers got a 2 terabyte monthly allowance, those who want something a little more get 3 terabytes, and everyone can buy an extra 50 gigabytes for $10 once they hit their monthly cap.

According to the story, Mediacom has been testing the fast service, which offers upstream speeds of 10 Mbps, since early this year in Cedar Rapids and adjoining communities Marion, Hiawatha, Bertram and Toddsville. Mediacom, which did not reveal any future rollouts, said that it has DOCSIS 3.0 deployed to 98 percent of its service area.

For more:
– Multichannel News has this story

Related articles:
Mediacom to deploy Pace MG1 gateway with TiVo interface
AT&T to bring 1 Gbps FTTH service to North Carolina
Verizon gives 1 GB of free data to ‘More Everything’ tablet customers
Mediacom creating 40 jobs at new corporate headquarters; Time Warner Cable upgrades My TWC app

Read more about: broadband speeds, CenturyLink

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Emboldened Russian Military continues to Kick Sand In the face of the 200lb Weakling America has become under Obama

image

President Obama told Russia’s Vladimir Putin on Monday that the U.S. had ‘grave concern’ about Moscow’s aggression in Ukraine — as pro-Russian protesters stormed government buildings in what U.S. officials said appears to be a coordinated effort backed by Moscow.

A senior administration official said the call between the two world leaders was ‘frank and direct,’ and was at the request of the Russians. The White House said Obama told Putin Russia’s support of pro-Russia separatists in Ukraine was a matter of ‘grave concern,’ and urged Putin to convince the forces to leave the buildings they have seized. ‘The president made clear that the diplomatic path was open and our preferred way ahead, but that Russia’s actions are neither consistent with or conducive to that,’ the official said.

The White House said Obama also told Putin he believes a diplomatic solution cannot succeed as long as the Russian government continues its aggression in Ukraine. The Kremlin also issued a statement about the phone call, saying Putin urged Obama to use the U.S.’ capabilities to prevent bloodshed in the region. Putin told Obama that concerns about meddling in southeastern Ukraine are speculations based on ‘inaccurate information.’
 
Buzzed off – The call came after a Russian fighter jet buzzed a U.S. warship in the Black Sea, making 12 passes by the USS Donald Cook over the span of 90 minutes. A Russian warship was also reportedly shadowing the Navy vessel during the incident. A U.S. spokesman called the episode “unprofessional.” The USS Donald Cook returned to port following the flybys. 

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Quality American Products Utilizing 3D Printed Manufacturing, Giving Cheap Chinese Products & Slave Labor A Run for their Money

image

Every day, the young, bearded, and tattooed team at the Shapeways factory in Long Island City, Queens, gathers to eat together, as if they lived in a 19th-century company town. There aren’t many dining options in the industrial area in which they work, so they order takeout pizza or food from FreshDirect and convene. Their routine is fitting: Shapeways, a three-dimensional printing company, tips its hat to the industrial past even as it seeks to reinvent manufacturing.

On almost every flat surface in Shapeways’ open plan office are whimsical creations — multicolored figurines of humanoid creatures, skeletons of imaginary animals, abstract sculptures that appear inspired by nature’s curving forms. There is even a fractal design of a deer’s head that looks like the frame on which you’d hang the real thing.

All of this illustrates the powerful potential that 3-D printing has to create an infinity of objects. The technology that makes 3-D printing possible has existed for decades, often employed as a prototyping tool in sophisticated industries such as aerospace. More recently, the do-it-yourself community and other hobbyists in the so-called Maker movement have adopted it as a way to create novel items. Now that the technology is no longer in doubt, Shapeways is betting that 3-D printed objects have a place in every home.

image

Shapeways was founded in the Netherlands in 2007 and incubated by the electronics giant Philips. It hosts an Etsy-like online marketplace of more than 13,500 online storefronts where designers showcase countless products, from figurines and credit card holders to jewelry and kitchenware. Last year, it raised $30 million in venture capital from heavyweight firms including Andreessen Horowitz. It’s now headquartered in Manhattan, across the river from the factory.

The Shapeways marketplace works like this: Customers search for items at the Shapeways site. After finding a design of interest and selecting a preferred material, the request is sent to staffers at the Shapeways factory, who determine if it is feasible. Then the printers whirr into action. Shapeways didn’t develop its 3-D printing technology, but it turns it into a consumer-facing business by manufacturing creations, reviewing them for defects, and dyeing or polishing them into finished products before shipping them.

The term “3-D printing” doesn’t refer to a specific process so much as a category of processes that are each capable of creating one-of-a-kind items. Desktop 3-D printers often rely on an “extruding” process in which a material is fired into a shape and then solidified. It’s sort of like a glue gun shooting both the glue and whatever it’s holding together.

Here at the Shapeways factory, the company uses a process called laser sintering in which each item gets fused together layer by layer. The material starts in a powdered form, and when a laser scans across it, it hardens into a layer that will, in aggregate, comprise the final product.

Mainstream manufacturing processes like injection molding are very good at producing an infinite number of identical products, but there are physical limitations to what can be created. 3-D printing changes the manufacturing process in at least two interesting ways: It allows for the creation of incredibly intricate designs (such as a pre-assembled chain of links) and makes it easier and less expensive to create one-off or limited-run creations. The design software acts as a conduit between the designer’s brain and the printer, limited only by her imagination and the laws of physics.

For example, an artificial tree printed in 3-D would begin at the bottom and add parts of the trunk, branches, and leaves with each successive layer — regardless of the complexity of each horizontal. Since the structure of a tree is so complex, an artificial tree produced with standard manufacturing technology might involve the creation of separate molds for the trunk, branches, and each individual leaf, and then require assembly.

With 3-D printing there’s “no cost of complexity,” Carine Carmy, Shapeways’ marketing director says.

The printers at Shapeways’ factory are produced by a company called EOS. The machines are the size of large refrigerators with a window on the front to peer inside. When the author glanced in during a tour of the space, he saw several items under construction — though it was difficult to make out just what they might be.

Beyond the printers is a rock tumbler the size of a round dinner table, used to smooth out the rough edges of a newly printed item. When it is turned on, the machine is so loud that you can’t stand to be in the room with it.

At the end of the line, there are bins for sorting. Like the rest of the factory it seemed relatively quiet during a recent visit, suggesting that mid-winter isn’t the high season for individualized products.

Shapeways chief executive Peter Weijmarshausen says the Shapeways marketplace, which launched in 2009, is like Apple’s App Store in that both facilitate creative entrepreneurship. Weijmarshausen says his company sold 1.2 million pieces last year, and he expects volume to triple this year. (He declined to comment on revenue.)

Despite the hype around 3-D printing, you need only look around the American home or office to see that this industrial process has not yet insinuated itself in daily life. But it’s already in use by major U.S. manufacturers: General Motors and Ford, for example, both use the technology to speed up the design and prototyping process.

The prospect of individualized manufacturing has lots of people excited. Some of the more feverish prognosticators say it could upend the global supply chain and reshuffle the geopolitics. It could revolutionize medical devices, and much else.

And so the technology has attracted business interest. Last year Stratasys acquired MakerBot, which manufactures a relatively affordable desktop 3-D printer, in a deal that could ultimately be worth more than $600 million. While MakerBots could theoretically become household items, Shapeways is instead charting a path of decentralized creativity but centralized manufacturing. Shapeways must now figure out how to sell unique and beautiful items to consumers who don’t fetishize the process of 3-D printing.

Weijmarshausen says that the technology has already cleared several barriers to adoption. 3-D printers were once “horrendously expensive” and required highly specialized software; they are more accessible now. The next step is relevance, he says, which “comes from great stories that people care about.” (Because while a 20-sided die may be inventive, it’s still as practical as, well, a 20-sided die.)

As it grows, Shapeways has to conquer what Carmy calls the “blank page” problem: Just because customers say they want unique things doesn’t mean they will devote much time or effort to acquiring them. And products that encourage personalization, like last year’s Moto X smartphone, don’t always succeed even when the options are far more limited.

To address this, Shapeways has introduced several web apps to facilitate creativity. With one program, users can turn a drawing into a 3-D bauble. Shapeways has also released apps to let users customize a ring or sake set, as well as a toolbox for more sophisticated designers. Shapeways also announced a partnership with Adobe that lets users upload and manufacture Photoshop creations.

“Everyone is more creative than what we call ‘big brands’ today,” Weijmarshausen says. “No more lowest common denominator.”

It’s a fine goal, but difficult to achieve in practice. Kostika Spaho, a designer who has sold products on Shapeways for several years, has designed a line of otherworldly coffee cups and co-designed a “biomimicry shoe” inspired by the shape of a bird’s skull. But his best-selling products on Shapeways are based on Internet memes. His No. 1 product, he says, is an obstreperous anteater, “due to the fact that it’s already popular and a lot of people know of it.”

“Instead of thinking of what products I should make, I go to Reddit and ask people what they want,” he says. This has led to such innovations as a desktop catapult that clamps to the corner of the desk. “It’s so ridiculous, but this is what they want.”

Spaho pauses. “I would like to make sexy, beautiful products like the cups and the shoes,” he says. “But they don’t sell.”

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Tax Day is now also ObamaCare Day as Billions to Subsidize the already Wealthy Insurance Companies Get Picked from your Pocket Today

image

Tax Day is now also ObamaCare Day –

For many Americans, today marks their first contact with ObamaCare, besides perhaps hearing that not a single legislator read the massive 2,000-plus-page bloatbill, as new taxes to finance the program’s insurance subsidies kick in. Despite its dismal performance in signing up new subscribers to the bloated socialist medical system Obama forced down American voters’ throats against their wishes (using political games and loopholes to push it through without a vote), new income taxes on average hard-working American families making a measly $100k and above, an increase in federal payroll taxes (killing small business growth at a time we need to be stimulating business!), and new taxes on investment income (the money invested in businesses, again slowing business growth) will help pump an estimated $20.5 billion into the program while the rest of America suffers in seething silence. (Only a dismal 25% of Americans think ObamaCare is a good idea.)

National Journal explains the unwelcome surprises for taxpayers:

“Most people won’t notice the extra Medicare tax because it was automatically deducted from their paychecks, but many will face a hefty tax bill they did not expect”

said Jackie Perlman, principal tax research analyst at the H&R Block Tax Institute.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

FOOD STAMP RECIPIENTS OUTNUMBER WORKING WOMEN

image

FOOD STAMP RECIPIENTS OUTNUMBER WORKING WOMEN 
CNSNews: “People participating in the food stamp program outnumbered the women who worked full-time, year-round in the United States in 2012, according to data from the Department of Agriculture and the Census Bureau…

For each woman who worked full-time, year-round in 2012, there was slightly more than 1 other person collecting food stamps.”

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Top Cloud Service Providers… Absolutely Horrible Speed, Latency & Consistency

image

Recently an article was posted by someone else who presented these numbers as “showing how good” CSP speed is. The truth is just the opposite. If you know anything about network infrastructure, and about the kind of consistency and latency a real private cloud data center and infrastructure can deliver for your business, these results are absolutely horrendous. They are one of the primary reasons why, if you made the mistake of throwing your entire business (SaaS or otherwise) onto the public cloud, you are experiencing so many complaints from your customers, or why the one administrator you hired is scratching his head and pointing his finger at the software development team. Little does he realize that the marketing hype claiming the cloud is “perfect and without fault” is exactly that: A BUNCH OF MARKETING HYPE.

If your business DEPENDS on performance, and on consistency of performance, then the LAST PLACE you should ever put it is in a hosted environment like these. They may not be fully managed, but they still allow zero input into, and zero insight into, the normal everyday aspects of any performance network, such as routing, backbone connectivity, caching, utilization levels and saturation levels. And that is before even getting into the massive security risks associated with these types of environments.

There are a number of little-known sites around the Internet that actually monitor things like Amazon’s uptime percentage, which is a lot worse than the advertised 96% (about as bad as you can get and still be in business), and its massive latency issues, which have been a constant complaint for years now and are one of the things I am often called in to investigate as an executive consultant.

Now to the numbers. The following text was taken from the article and was not written by me. I have removed the author’s commentary because, as I said, he is an Amazon vendor and his bias was glaringly evident in his confused take on these absolutely terrible results, results I would have been ashamed of when I owned my own ISP for five years in Los Gatos, CA, beginning in 1999:
Methodology

The testing was done using the iperf tool on Linux. One server acts as the client and the other as the server:

Server: iperf -f m -s

Client: iperf -f m -c hostname

The OS was Ubuntu 12.04 (with all latest updates and kernel), except on Google Compute Engine, where it’s not available. There, I used the Debian Backports image.

The client was run for three tests for each type – within zone, between zones and between regions – with the mean average taken as the value reported.
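
A minimal sketch of that procedure is below: run the iperf client three times against the target and average the bandwidth it reports. The hostname is a placeholder, and the awk pattern assumes iperf’s default summary line, which ends in “Mbits/sec” when the -f m flag is used.

# Run the client three times and average the reported throughput
for i in 1 2 3; do
  iperf -f m -c test-host.example.com | awk '/Mbits\/sec/ {print $(NF-1)}'
done | awk '{ total += $1; n++ } END { if (n) printf "mean: %.1f Mbits/sec over %d runs\n", total/n, n }'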

Amazon networking performance

t1.micro (1 CPU)
us-east-1 zone 1a to us-east-1 zone 1a: 135 Mbits/sec
us-east-1 zone 1a to us-east-1 zone 1d: 101 Mbits/sec
us-east-1 zone 1a to us-west-1 zone 1a: 19 Mbits/sec

c3.8xlarge (32 CPUs)
us-east-1 zone 1a to us-east-1 zone 1a: 7013 Mbits/sec
us-east-1 zone 1a to us-east-1 zone 1d: 3395 Mbits/sec
us-east-1 zone 1a to us-west-1 zone 1a: 210 Mbits/sec

Amazon’s larger instances, such as the c3.8xlarge tested here, support enhanced 10 Gigabit networking; however, you must use the Amazon Linux AMI (or manually install the drivers) within a VPC. Because of the additional complexity of setting up a VPC, which isn’t necessary on any other provider, I didn’t test this, although it is now the default for new accounts.

However, the consistency of the performance wasn’t so good. The speeds changed quite dramatically across the three test runs for all instance types, much more than with any other provider.

You can use internal IPs within the same zone (free of charge) and across zones (incurs inter-zone transfer fees), but across regions, you have to go over the public internet using the public IPs, which incurs further networking charges.

Google Compute Engine networking performance

f1-micro (shared CPU)
us-central-1a to us-central-1a: 692 Mbits/sec
us-central-1b to us-central-1b: 905 Mbits/sec
us-central-1a to us-central-1b: 531 Mbits/sec
us-central-1a to europe-west-1a: 140 Mbits/sec
us-central-1b to europe-west-1a: 137 Mbits/sec

n1-highmem-8 (8 CPUs)
us-central-1a to us-central-1a: 2976 Mbits/sec
us-central-1b to us-central-1b: 3042 Mbits/sec
us-central-1a to us-central-1b: 2678 Mbits/sec
us-central-1a to europe-west-1a: 154 Mbits/sec
us-central-1b to europe-west-1a: 189 Mbits/sec

Google doesn’t currently offer an Ubuntu image, so instead I used its backports-debian-7-wheezy-v20140318 image. For the f1-micro instance, I got very inconsistent iperf results for all zone tests. For example, within the same us-central-1a zone, the first run showed 991 Mbits/sec, but the next two showed 855 Mbits/sec and 232 Mbits/sec. Across regions between the US and Europe, the results were much more consistent, as were all the tests for the higher spec n1-highmem-8 server. This suggests the variability was because of the very low spec, shared CPU f1-micro instance type.

I tested more zones here than on other providers because on April 2, Google announced a new networking infrastructure in us-central-1b and europe-west-1a which would later roll out to other zones. There was about a 1.3x improvement in throughput using this new networking and users should also see lower latency and CPU overhead, which are not tested here.

Although 16 CPU instances are available, they’re only offered in limited preview with no SLA, so I tested on the fastest generally available instance type. Since networking is often CPU bound, there may be better performance available when Google releases its other instance types.

Google allows you to use internal IPs globally.

Rackspace networking performance

512 MB Standard (1 CPU)
Dallas (DFW) to Dallas (DFW): 595 Mbits/sec
Dallas (DFW) to North Virginia (IAD): 30 Mbits/sec
Dallas (DFW) to London (LON): 13 Mbits/sec

120 GB Performance 2 (32 CPUs)
Dallas (DFW) to Dallas (DFW): 5539 Mbits/sec
Dallas (DFW) to North Virginia (IAD): 534 Mbits/sec
Dallas (DFW) to London (LON): 88 Mbits/sec

Rackspace does not offer the same kind of zone/region deployments as Amazon or Google so I wasn’t able to run any between-zone tests. Instead I picked the next closest data center. Rackspace offers an optional enhanced virtualization platform called PVHVM. This offers better i/o and networking performance and is available on all instance types, which is what I used for these tests.

Similar to Amazon, you can use internal IPs within the same location at no extra cost but across regions you need to use the public IPs, which incur data charges.

When trying to launch two 120 GB Performance 2 servers at Rackspace, I hit the account quota (with no other servers on the account) and had to open a support ticket to request a quota increase, which took about an hour and a half to approve. For some reason, launching servers in the London region also requires a separate account, and logging in and out of multiple control panels soon became annoying.

Softlayer networking performance

1 CPU, 1 GB RAM, 100 Mbps
Dallas 1 to Dallas 1: 105 Mbits/sec
Dallas 1 to Dallas 5: 105 Mbits/sec
Dallas 1 to Amsterdam: 29 Mbits/sec

8 CPUs, 2 GB RAM, 1 Gbps
Dallas 1 to Dallas 1: 911 Mbits/sec
Dallas 1 to Dallas 5: 921 Mbits/sec
Dallas 1 to Amsterdam: 61 Mbits/sec

Softlayer only allows you to deploy into multiple data centers at one location: Dallas. All other regions have a single facility. Softlayer also caps out at 1 Gbps on its public cloud instances, although its bare metal servers do have the option of dual 1 Gbps bonded network cards, allowing up to 2 Gbps. You choose the port speed when ordering or when upgrading an existing server. They also list 10Gbit/s networking as available for some bare metal servers.

Similarly to Google, Softlayer’s maximum instance size is 16 cores, but it also offers private CPU options which give you a dedicated core versus sharing the cores with other users. This allows up to eight private cores, for a higher price.

The biggest advantage Softlayer has over every other provider is completely free private networking between all regions, whereas all other providers charge for transfer out of zone. When you have VLAN spanning enabled, you can use the private network across regions, which gives you an entirely private network for your whole account.

These excerpts were written by David Mutton, CEO of “Server Density,” a large cloud service vendor; no wonder he is “touting” these as “good” numbers.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

“Extremely High Risk to Businesses Using Cloud Services” of falling victim to the Heartbleed security vulnerability, according to experts

image

Businesses using cloud services are at “extremely high” risk of falling victim to the Heartbleed security vulnerability, according to experts.

The Heartbleed bug was first found earlier this week and is a vulnerability in OpenSSL — technology used to protect sensitive data — that allows attackers to hack into software. Since it reared its head, security experts have warned users of cloud services to change their passwords to mitigate the risk.

But it is not just consumers who are at risk, according to cloud security specialist Skyhigh Networks, which claims enterprises face a similarly serious situation.

“While the focus in the media was initially on high-profile consumer sites such as Yahoo Mail, many cloud services present an even greater risk to companies storing sensitive data on those services,” officials with the company said. “Over the past weeks, security teams across the country have been grappling with end of life for Windows XP… [but] that issue has been completely overshadowed with news of the Heartbleed vulnerability.”

Skyhigh Networks said its intelligence shows that 24 hours after the vulnerability hit the headlines, 368 cloud providers had still not patched their wares, making them vulnerable to attack. It did not divulge which firms’ services were affected but claimed “leading backup, HR, security, collaboration, CRM, ERP, cloud storage, and backup services” were among them.

“The average company uses 626 cloud services, making the likelihood they use at least one affected service extremely high,” officials added.

ALERT
Where these officials get their data is beyond me; no company comes anywhere close to using 600 cloud services. There aren’t even that many cloud service providers in the entire world!

CHANGING YOUR PASSWORD WILL HAVE ABSOLUTELY ZERO EFFECT (I’m not sure what “experts” this article refers to, but they obviously are NOT SECURITY EXPERTS). The issue is in the SSL protocol implementation itself, NOT YOUR PASSWORD. So if you are using a hosting or cloud service for ANYTHING, you need to contact that provider and ask whether they have fixed all of their OpenSSL implementations. If you do not believe you can trust them to answer truthfully, I encourage you to REMOVE ANY AND ALL CRITICAL INFORMATION from that service, and then hire someone like myself or another trusted security expert to do a complete analysis for you before you place your business data at risk outside of your internal firewall again…
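
If you would rather verify a provider’s public-facing endpoints yourself than take their word for it, newer Nmap releases include an ssl-heartbleed detection script (older installs may need the script added manually). A minimal, read-only check of a hypothetical host looks like this; only scan systems you own or are authorized to test.

# Probe an HTTPS endpoint for the Heartbleed (CVE-2014-0160) vulnerability
nmap -p 443 --script ssl-heartbleed your-provider.example.com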

This is NO JOKE. Absolutely EVERY INTERNET SERVICE IS NOW COMPROMISED, no matter what they may have emailed you otherwise, and no amount of changing your password will fix this, which is why true experts are calling it the WORST SECURITY VULNERABILITY EVER IN THE HISTORY OF THE INTERNET.

CALL ME if you are not sure if your company is at risk and I can help you protect yourself and your business immediately 209-263-2976

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Kentucky Enacts a Data Breach Notification Law and Protects Student Data in the Cloud

image

Kentucky Gov. Steve Beshear signed H.B. 232 on April 10, 2014, making the Commonwealth the 47th state to enact a data breach notification law. The law also limits how cloud service providers can use student data. A breach notification law in New Mexico may follow shortly.

Data Breach Notification Mandate

The Kentucky law follows the same general structure of many of the breach notification laws in the other states:

A breach of the security of the system happens when there is unauthorized acquisition of unencrypted and unredacted computerized data that compromises the security, confidentiality, or integrity of personally identifiable information maintained by the information holder as part of a database regarding multiple individuals, and that actually causes, or leads the information holder to reasonably believe has caused or will cause, identity theft or fraud against any resident of Kentucky. The law refers only to “acquisition,” not “access,” and appears to include a risk-of-harm trigger.

The good faith acquisition of personally identifiable information by an employee or agent of the information holder for the purposes of the information holder is not a breach if the personally identifiable information is not used or subject to further unauthorized disclosure.

“Personally identifiable information” means an individual’s first name or first initial and last name in combination with the individual’s (i) Social Security number; (ii) driver’s license number; or (iii) account number or credit or debit card number, in combination with any required security code, access code, or password that would permit access to the individual’s financial account.

The notification required under the law must be made in the most expedient time possible and without unreasonable delay, consistent with the legitimate needs of law enforcement or any measures necessary to determine the scope of the breach and restore the reasonable integrity of the data system.

Notice may be provided in writing and can be provided electronically if the E-Sign Act requirements are met. For larger breaches, the law also contains substitute notice provisions similar to those in other states.

If notification is required to more than 1,000 Kentuckians at one time under this law, all nationwide consumer reporting agencies and credit bureaus also must be notified of the timing, distribution and content of the notices. However, the law does not require the Kentucky Attorney General to be notified of the incident, as is the case in a number of other states such as California, Maryland, Massachusetts, New Hampshire, and New York.

The law excludes persons and entities that are subject to Title V of the Gramm-Leach-Bliley Act of 1999 and the Health Insurance Portability and Accountability Act of 1996 (HIPAA). Of course, covered entities, business associates and certain vendors have their own breach notification requirements.

Protections for Student Data In the Cloud

The law is designed to protect student data at public or private educational institutions, including any administrative units, that serve students in kindergarten through grade twelve, when that data is stored in the “cloud.” We may see more of these kinds of laws, particularly in light of the Fordham Law School study on the topic. For purposes of this law, “student data” means

any information or material, in any medium or format, that concerns a student and is created or provided by the student in the course of the student’s use of cloud computing services, or by an agent or employee of the educational institution in connection with the cloud computing services. Student data includes the student’s name, email address, email messages, postal address, phone number, and any documents, photos, or unique identifiers relating to the student.

Cloud providers serving these institutions in Kentucky need to be aware of this law not only so they can take steps to comply, but because it requires the providers to certify in their services contracts with the educational institutions that the providers will comply with this new law.

Specifically, the law prohibits cloud computing service providers from “processing student data for any purpose other than providing, improving, developing, or maintaining the integrity of its cloud computing services, unless the provider receives express permission from the student’s parent.” Processing is defined quite broadly: it means to “use, access, collect, manipulate, scan, modify, analyze, transform, disclose, store, transmit, aggregate, or dispose of student data.”

While the provider may assist an educational institution with certain research permitted under the Family Educational Rights and Privacy Act of 1974, also known as “FERPA,” it may not use the data to “advertise or facilitate advertising or to create or correct an individual or household profile for any advertisement purpose.” Finally, the provider may not sell, disclose, or otherwise process student data for any commercial purpose.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

US takes out gang that used Zeus malware to steal millions

image

The US Department of Justice today charged nine members of a group that used Zeus malware to infect thousands of business computers and illegally siphon off millions of dollars into overseas bank accounts.

The DoJ said an  indictment was unsealed in connection with the arraignment this week at the federal courthouse in Lincoln, Neb.,  of two Ukrainian nationals, Yuriy Konovalenko, 31, and Yevhen Kulibaba, 36.  Konovalenko and Kulibaba were recently extradited from the United Kingdom.  All of the defendants had been charged by a federal grand jury in August 2012 with conspiracy to participate in racketeering activity, conspiracy to commit computer fraud and identity theft, aggravated identity theft, and multiple counts of bank fraud.

According to the indictment, the defendants participated in an enterprise and scheme that installed, without authorization, malicious software known as Zeus or “Zbot” on victims’ computers  associated with Bank of America, First National Bank of Omaha, Nebraska, the Franciscan Sisters of Chicago and Key Bank.

The defendants are charged with using that malicious software to capture bank account numbers, passwords, personal identification numbers, RSA SecureID token codes and similar information necessary to log into online banking accounts.  The indictment alleges that the defendants falsely represented to banks that they were employees of the victims and authorized to make transfers of funds from the victims’ bank accounts, causing the banks to make unauthorized transfers of funds from the victims’ accounts, the DoJ stated.

As part of the enterprise and scheme, the defendants allegedly used US residents as “money mules” who received funds transferred over the Automated Clearing House network or through other interstate wire systems from victims’ bank accounts into the money mules’ own bank accounts.  These money mules then allegedly withdrew some of those funds and wired the money overseas to conspirators, the DoJ stated.

According to court documents unsealed today, Kulibaba allegedly operated the conspirators’ money laundering network in the United Kingdom by providing money mules and their associated banking credentials to launder the money withdrawn from U.S.-based victim accounts.  Konovalenko allegedly provided money mules’ and victims’ banking credentials to Kulibaba and facilitated the collection of victims’ data from other conspirators.

The DoJ noted that four identified defendants remain at large:

    Vyacheslav Igorevich Penchukov, 32, of Ukraine, who allegedly coordinated the exchange of stolen banking credentials and money mules and received alerts once a bank account had been compromised.
    Ivan Viktorvich Klepikov, 30, of Ukraine, the alleged systems administrator who handled the technical aspects of the criminal scheme and also received alerts once a bank account had been compromised.
    Alexey Dmitrievich Bron, 26, of Ukraine, the alleged financial manager of the criminal operations who managed the transfer of money through an online money system known as Webmoney.
    Alexey Tikonov, of Russia, an alleged coder or developer who assisted the criminal enterprise by developing new code to compromise banking systems.

The indictment also charges three other individuals as John Doe #1, John Doe #2 and John Doe #3.

From a recent Network World story: Zeus is the top banking Trojan, according to Dell SecureWorks, which made major discoveries about criminally-operated botnets based on the malware that date back to 2007. Zeus is often described as sophisticated banking Trojan malware that can execute an array of financially-oriented attacks, such as grabbing online credentials and siphoning off funds in payment systems.

According to the SecureWorks report, “Top Banking Botnets of 2013,” Zeus banking Trojan variants accounted for about half of all banking malware seen in 2013.  SecureWorks points out that Zeus is now being used not just to attack financial institutions but also stock trading, social-networking and e-mail services, plus portals for entertainment or dating, for example.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

France Bans Work Emails Sent Later Than 6pm!

image

In the digital era, it can be difficult to truly clock out at the end of the day. Smartphones buzz with work email until the wee hours, particularly if you have a global team in multiple time zones.

If one of those team members happens to live in France, though, make sure you send them any important messages before 6 p.m. local time. As noted by The Guardian, French trade unions have negotiated an agreement that puts the kibosh on any after-hours communication. Going forward, employees cannot be penalized if they do not respond to work messages during off hours.

The deal is an amendment to a 1999 agreement that mandated a 35-hour work week in the country, LesEchos.fr reported.

Given the reach of the global economy, the rules mean that the French offices for companies like Google and Facebook will be affected, The Guardian said. Presumably, however, someone would only get in trouble if a complaint was filed; it’s not as if work emails and phones would be deactivated after 6 p.m.

In 2012, Google executive chairman Eric Schmidt was the commencement speaker at Boston University, where he urged graduates to unplug for at least an hour a day. “Take one hour a day and turn that thing off,” Schmidt said at the time. “Take your eyes off that screen and look into the eyes of the person you love. Have a conversation, a real conversation.”

In the U.S., the National Day of Unplugging was held last month. That’s a far cry from the 133 hours the French could possibly unplug, but not all of our time spent on smartphones, tablets, and PCs is work-related, of course.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Aereo, While Battling for Its Life in the US Supreme Court, Goes Ahead with Its Google App Launch

image

Aereo, the cloud-based antenna and DVR technology currently in a death struggle with broadcasters and the U.S. Justice Department over the right to exist, is moving ahead with plans to become an app in the Google (NASDAQ: GOOG) Play store.

If all goes according to plan, starting May 29, current Aereo subscribers and others living in Aereo markets will be able to download the app for Android, including support for Chromecast, the company said in a press release. The Google application adds to existing apps for Apple (NASDAQ: AAPL) iPad, iPhone and iPod Touch as well as Chrome for Mac, Chrome for Windows, Safari, Internet Explorer 9, Firefox, Opera Software, AppleTV and ROKU.

“The way people watch and experience television is changing and Google is a pioneer in providing consumers with more choice and flexibility in how they access and experience the media,” Aereo CEO and founder Chet Kanojia said in the press release. “Consumers deserve more options and alternatives in how they watch television and our team is committed to providing consumers with the best experience possible using Aereo’s innovative cloud technology.”

That is, if that cloud technology continues to be available. Broadcasters claim Aereo violates their copyrights and undermines their ability to sell the rights to their content via retransmission agreements, and they have taken the matter to the courts.

Kanojia himself admitted that the company could cease to exist if it loses a battle that is now going before the U.S. Supreme Court. Broadcasters have the support of the Justice Department in claiming that Aereo gives consumers access to copyrighted content and doesn’t pay licensing fees for that content. Aereo, meanwhile, recently picked up an endorsement from the American Cable Association (ACA) representing smaller cable operators, although major operators have remained silent on the battle.

Last month Kanojia was candid when asked about his company’s future during a Bloomberg Webcast, saying, “If we don’t succeed, despite our best efforts and the good law being on our side, it would be a tragedy. But it is what it is.”

For an $8 monthly fee Aereo subscribers in New York, Boston, Atlanta, Miami, Houston, Dallas, Detroit, Baltimore, Cincinnati, San Antonio and Austin get access to local broadcast programs received via an over-the-air antenna and streamed to their device. They can also record and play back programming via a cloud-based DVR, and can increase the amount of recording time (and add a second tuner) for $12 monthly.

For more:
– see this press release

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Identity Theft on the Rise, Over 200% Increase Since 2012, IRS Warns

image

While tax return fraud seems to have hit epidemic proportions, the Internal Revenue Service today said it has started more than 200 new investigations this filing season into identity theft and refund fraud schemes.

The agency’s Criminal Investigation unit has started 295 new identity theft investigations since January, pushing the number of active cases to more than 1,800.

The unit’s work is part of a larger effort at the IRS to combat identity theft and refund fraud by pursuing identity thieves, preventing fraudulent refunds from being issued and helping victims of this crime.

The IRS said that since the start of 2014, increased activity by CI has led to more prosecution recommendations, indictments and sentencing hearings, which reflect the overall success by the IRS on the increased number and effectiveness of ID theft filters used during the processing of tax returns. Highlights of this year’s work include:

A new and key component of IRS efforts this year is investigating the misuse of Electronic Filing Identification Numbers (EFINs). An EFIN is assigned to tax preparers that have completed the IRS e-file Application to become an Authorized IRS e-file Provider. After the provider completes the application and passes a suitability check, the IRS sends an acceptance letter, including the EFIN, to the provider.

Since the start of the fiscal year through March 31, 2014, the IRS has revoked or suspended 395 EFINS based on recommendations from CI, and CI has initiated 60 EFIN source investigations involving EFINs used by individuals involved in refund fraud and identity theft schemes. By revoking and suspending the EFINs, IRS can prevent the transmission of the fraudulent tax returns, the IRS stated.

In Fiscal Year 2013, the IRS initiated approximately 1,492 identity theft related criminal investigations, an increase of 66% over investigations initiated in 2012. Direct investigative time applied to identity theft related investigations has increased 216%  over the last two years. Prosecution recommendations, indictments, and those convicted and sentenced for identity theft violations have increased dramatically since FY 2011. Sentences handed down for convictions relating to identity theft have been significant, ranging from two months to 317 months.

The IRS detailed some recent cases including:

• On March 27, 2014, a Miami man was convicted by a jury of one count of access device fraud and five counts of aggravated identity theft. According to the indictment and evidence, the defendant obtained an IRS Electronic Filing Identification Number and used it to file 52 fraudulent tax returns, many of them filed with stolen identities.

• On Feb. 27, 2014, in Tampa, Fla., two defendants were sentenced to 121 months and 192 months in prison, respectively. As part of their sentence, the court entered a $790,421 money judgment against each, as well as $790,421 in restitution. Both pleaded guilty to conspiring to commit wire fraud and aggravated identity theft. According to court documents, the defendants and others orchestrated a scheme to defraud the United States Treasury by causing fraudulent federal income tax returns to be filed using stolen identities, and soliciting personal identifying information and addresses from co-conspirators in Florida and Georgia. To facilitate the scheme, the conspirators coordinated the withdrawal of fraudulently obtained tax refund amounts from prepaid debit cards. The identities used to file the fraudulent tax returns in this scheme belonged to individuals living in various states across the country. As part of the conspiracy, at least 322 federal income tax returns for tax year 2011 were filed claiming refunds of $2,701,844.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Facebook faces class action suit in Canada over interception of private messages

image

Facebook is facing a class-action lawsuit in Canada over its alleged interception of private messages of users of the social network.

The lawsuit in the Ontario Superior Court alleges that URLs (uniform resource locators) in the private messages were “harvested” by Facebook in violation of its users’ privacy, without their knowledge or consent, Rochon Genova, the law firm representing the users, said Wednesday.

Facebook did not disclose to users that their private messages would be intercepted and scanned, and the contents of those messages treated as “likes” for third-party sites through the social plug-in function, according to the law firm.

“The complaint is without merit and we will continue to defend ourselves vigorously,” a spokeswoman for the social networking company said via email.

The company is already facing similar lawsuits in the U.S. over its alleged interception and scanning of the content of private messages.

Citing research by Swiss information security firm High-Tech Bridge and others, Facebook users Matthew Campbell and Michael Hurley filed in December a suit in the U.S. District Court for the Northern District of California on behalf of all Facebook users in the U.S. who have sent or received private Facebook messages that included a URL in the content of the message.

High-Tech Bridge wrote in August last year that Facebook was one of the Web services it tested that was caught scanning URLs despite such activity remaining undisclosed to the user, according to the complaint.

Facebook mined user data and profited by sharing the data with third parties such as advertisers, marketers, and other data aggregators, despite having made representations that “reflect the promise that only the sender and the recipient or recipients will be privy to the private message’s content, to the exclusion of any other party, including Facebook,” the complaint added.

The lawsuit is proposed to be consolidated with a similar one filed in January in the Northern District of California by another Facebook user David Shadpour. If there was a URL in the private message, Facebook searched the website identified in the URL for purposes such as data mining and user profiling, according to Shadpour’s complaint.

Facebook quietly shelved the practice, without acknowledging it, in October 2012 after a report in the Wall Street Journal exposed the scanning, according to Rochon Genova.

The class-action lawsuit in Ontario includes all Canadian resident Facebook users who sent or received private messages containing URLs up to October 2012. There are more than 18 million Facebook users in Canada and around three-quarters of them log on to Facebook at least once a day, according to Rochon Genova.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Software-Defined Internet… So-Called “Net Neutrality” Would Kill That Dream for Good

image

In recent weeks, we’ve seen both American and European regulators offer more narrowly confined contexts for their principles of net neutrality. In Europe, lawmakers are eager to codify into law a new definition that allows service providers to innovate to deliver high-quality services, such as the class needed for high-definition videoconferencing, so long as they don’t degrade the quality of service for others. And in the U.S., FCC Chairman Tom Wheeler (albeit with some difficulty getting his message through to the press) drew distinctions between the interconnection or peering agreements that service and content providers may make (such as Comcast and Netflix), and the sacrifices that all providers must make to maintain a discrimination-free Internet.

If everyone ends up agreeing on this–or rather, to borrow a phrase from “The Simpsons,” “everyone who counts”–then this calls into question the guiding principle that then-FCC Chairman Julius Genachowski cited as the basis of all Internet innovation since its very origin: “TCP/IP reflects a so-called ‘end-to-end’ system design, in which the routers in the middle of the network are not optimized toward the handling of any particular application, while network endpoints (the user’s computer or other communicating device) are expected to perform the functions necessary to support specific networked applications.”

It occurred to me that this flies right in the face of the concept of software-defined networking: the idea that the application may influence the schematic of the network over which it provides services. If this is a perfectly acceptable and even preferable model for corporate networks, why does it suddenly become verboten when applied in theory to the Internet at large–to the idea that applications that need more bandwidth should be offered a path of least resistance?

In 2012, a team of researchers from UC Berkeley, and from the International Computer Science Institute affiliated with UC Berkeley, put forth publicly their concept of a software-defined Internet: essentially, one that decouples Internet architecture from Internet infrastructure.

In their preamble to “Making the Internet More Evolvable” (.pdf) the team states, “Some argue that we require a radically different architecture to enable evolution. To the contrary, we contend that a simple re-engineering of the basic Internet interfaces to make them more modular and extensible–as one would in any software system–is sufficient to produce a far more evolvable Internet.”

Because the infrastructure is so inflexible, these researchers argue, the architecture has become almost impossible to evolve–as evidenced by the still-ongoing transition to IPv6. For their solution, the Berkeley team would put Genachowski’s assertion to the ultimate test. Wiping the blackboard clean first, they then reassemble the context of the Internet’s data plane as comprised of a network core with an internal address scheme, and a network edge that employs software-based forwarding.

They then delegate the task of defining how packets are forwarded to the edge. This way, it becomes unnecessary for protocol to define the behavior of routers in the middle of the network, as this behavior is entirely anticipated and verified at the edge. Framed like this, the Berkeley/ICSI team’s network (which they code-named “Omega,” perhaps after the NSA left behind no remaining code-words for anyone else) appears to be the gold standard for Genachowski’s original vision, as embodied in the FCC’s Open Internet guidelines (which, for now, are suspended). It’s end-to-end design in real-time, where the routers are essentially slaves.

Then you come across the following passage in “Software-Defined Internet Architecture” (.pdf) [emphasis mine]:

“…One need not specify the forwarding behavior of each box beforehand because as long as two routers talk to the same controller they can be made to interoperate by the controller.  This allows us to take a top-down perspective, by which we mean that we focus not on what each box does individually but instead first look at how to decompose Internet service into well-defined tasks, and then consider how to implement those tasks in a modular fashion.”

Uh-oh. The eureka moment here happens when Berkeley’s expertise and Genachowski’s vision coalesce into something that Genachowski might never have anticipated. You see, if the edge is endowed with intelligence such that it can steer the direction of routing tasks according to application class, then even if the Internet as a whole is greatly improved, it is no longer neutral in any way, shape or form.
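
To make the concern concrete, here is a deliberately tiny, hypothetical sketch in C of what “the edge steering routing tasks by application class” could look like: a policy table, pushed down from some controller, maps each traffic class to a path and a service target, and the edge consults it per flow. Every name in it is invented for illustration; this is not the Berkeley/ICSI “Omega” design or any real controller API.

```c
/* sdn_edge_sketch.c -- hypothetical illustration, not a real SDN controller API.
 * The point: once forwarding is chosen per application class at the edge,
 * the network is programmable -- and no longer class-neutral by construction.
 */
#include <stdio.h>

typedef enum { CLASS_BULK, CLASS_WEB, CLASS_REALTIME_VIDEO } app_class;

typedef struct {
    app_class cls;
    const char *path;     /* label for the path the controller assigned */
    int max_latency_ms;   /* service target this class is promised */
} policy_entry;

/* Policy table pushed down from a (hypothetical) controller to the edge. */
static const policy_entry policy[] = {
    { CLASS_BULK,           "best-effort transit", 500 },
    { CLASS_WEB,            "standard transit",    100 },
    { CLASS_REALTIME_VIDEO, "premium fast lane",    30 },
};

static const policy_entry *lookup(app_class cls)
{
    for (size_t i = 0; i < sizeof policy / sizeof policy[0]; i++)
        if (policy[i].cls == cls)
            return &policy[i];
    return NULL;
}

int main(void)
{
    /* An edge router classifying one flow of HD videoconference traffic. */
    const policy_entry *p = lookup(CLASS_REALTIME_VIDEO);
    if (p)
        printf("forward via: %s (target latency %d ms)\n", p->path, p->max_latency_ms);
    return 0;
}
```

The neutrality problem falls straight out of the table: whoever writes the “premium fast lane” row decides which applications get it.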

The danger in this will be highlighted at some point, despite the as-yet-unfathomably enormous potential benefits of a software-defined Internet. Someone will raise the specter of evil, and an advocacy group will declare it a conspiracy.

So let’s get it out of the way now, lest we lose the courage to discuss “Omega’s” potential. If software can define routes according to service class, then it will become feasible, and certainly tempting, for service providers to lock down those service classes and carve the maps for their premium Internet services in advance. Indeed, there may be valid engineering reasons for them to do so. But the business reasons will also be there, and they will be given the blanket designation of “innovation.”

If Comcast or something like it has the power to designate “fast-lanes” for exclusive content provider customers (itself included) entirely in software, then the only remaining reason why a government regulator should prohibit it from doing so is that it can establish artificially high prices for such premium service–whose eventual costs are passed down to consumers. If the Internet gets faster, this passing down will only happen sooner.

The FCC presently lacks the power to regulate commerce at this level. Perhaps another agency has that power. In any event, it will be up to Congress to make that determination, and to give that agency the authority and mandate. And right now, Congress is incapable of deciding the proper way to crack an egg. At some point, there will need to be an open panel of influential people with the intelligence and wherewithal to reason a way through this problem. And right now, there isn’t one. – Scott


By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Sloppy Coding Blamed for “Heartbleed” Most Serious SSL Encryption Vulnerability Ever

image

“A mistake in C code” is responsible for one of the most serious bugs discovered in any implementation of SSL/TLS encryption: the OpenSSL open source toolkit, according to security engineers with Finland-based Codenomicon, in an e-mail to FierceEnterpriseCommunications Wednesday morning. More to the point, it’s a programming shortcut deeply embedded in OpenSSL’s C source code, which may be attributed to sloppy coding, but which history also indicates may be an omission intended to make code run faster.

“True, C as a language is infamous for buffer overflows and other memory handling mistakes,” says Codenomicon spokesperson Ari Takanen, “but there are reasons why C will still be used especially in operating systems and embedded devices for quite some time. OpenSSL is an open-source library commonly used in all types of communications software. Good programming practices can help a long way, but people always make mistakes. That is why testing is still always needed.”

Here’s a real-world analogy for the situation: Imagine a pay telescope installed on the balcony of a public sightseeing hotspot. If there were no guard posts keeping the telescope from turning 360 degrees, someone could use that telescope to peek into private property. The same principle applies to unmanaged software code: If the matter of public and private boundaries for memory contents is left up to the programmer, then even if the programmer obeys her own rules, someone else using the same code might not.

Only in recent months have Codenomicon engineers Matti Kamunen, Antti Karjalainen and Riku Hietamäki been deploying a tool they call Defensics to check for bugs in commonly deployed security tools. OpenSSL is indeed quite common, responsible for more than 17 percent of the world’s site-signing certificates, according to two independent estimates by Netcraft and Datanyze (their numbers differ by a tenth of a percent). Though two-thirds of the world’s Web servers run either Apache or nginx servers known to employ OpenSSL, it’s a much safer estimate that about 7 sites out of every 40 may be susceptible to having their keys exposed, and later to be spoofed in man-in-the-middle attacks.

Takanen told FierceEnterpriseCommunications that the Codenomicon team discovered “Heartbleed” while working with Google Security’s Neel Mehta to improve a feature of their Defensics suite, called SafeGuard. When the feature was introduced last January, the company touted it as using automated analysis to reveal unusual or unexpected responses from software that are more subtle than outright crashes.

In this case, SafeGuard spotted unexpected responses from the “heartbeats” generated by relatively recent versions of OpenSSL. A “heartbeat” is a signal, sometimes generated at designated intervals, that a service uses on an asynchronous and unreliable transport mechanism (such as TCP/IP) to reassure a client service that it’s still present.

The Heartbeat Extension protocol was finalized by IETF in January 2011. It’s a mechanism for one party in a secure connection to ping the other (once their handshake process is complete), and receive an immediate response. It is an unsophisticated protocol, without so much as a regular rhythm. It simply resolves the problem of assuring both parties in a secure channel that they don’t need to renegotiate the terms of their session.

Apparently, OpenSSL versions 1.0.1 through 1.0.1f were written in C without the kind of bounds checking that restricts memory reads and writes to safe ranges. Since it is not managed code, C effectively leaves programmers to keep track of their own arrays–a practice that leads to buffer overruns.
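
To see the class of bug being described, consider this simplified sketch in C. It is emphatically not the actual OpenSSL source; the struct and field names are invented for clarity, but the pattern is the one at issue: the vulnerable path trusts a length supplied by the peer, while the patched path checks it against the bytes that actually arrived.

```c
/* heartbeat_sketch.c -- simplified illustration of the Heartbleed bug class.
 * Not the real OpenSSL code: field names and sizes are invented for clarity.
 */
#include <stdio.h>
#include <string.h>

struct heartbeat_msg {
    size_t claimed_len;            /* length the peer *says* its payload has */
    unsigned char payload[64];     /* bytes actually received */
    size_t received_len;           /* how many payload bytes really arrived */
};

/* Vulnerable pattern: copy 'claimed_len' bytes, trusting the peer. If
 * claimed_len > received_len, the memcpy reads adjacent process memory
 * (keys, passwords, session data) and echoes it back to the attacker. */
static size_t reply_vulnerable(const struct heartbeat_msg *m, unsigned char *out)
{
    memcpy(out, m->payload, m->claimed_len);      /* no bounds check */
    return m->claimed_len;
}

/* Patched pattern: refuse any request whose claimed length exceeds what
 * was actually received (this mirrors the intent of the 1.0.1g fix). */
static size_t reply_patched(const struct heartbeat_msg *m, unsigned char *out)
{
    if (m->claimed_len > m->received_len)
        return 0;                                 /* silently drop the request */
    memcpy(out, m->payload, m->claimed_len);
    return m->claimed_len;
}

int main(void)
{
    /* A well-behaved message, so the demo itself never reads out of bounds. */
    struct heartbeat_msg m = { .claimed_len = 16, .received_len = 16 };
    memcpy(m.payload, "hello, heartbeat", 16);

    unsigned char out[64];
    printf("patched reply: %zu bytes\n", reply_patched(&m, out));
    printf("vulnerable reply echoes %zu bytes, trusted blindly\n",
           reply_vulnerable(&m, out));
    return 0;
}
```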

It could be sloppy programming. Or it could be an intentionally deployed wrapper (substitute code which adds or perhaps subtracts functionality) that handles the caching of memory differently. Last Tuesday, OpenBSD founder Theo de Raadt noted evidence of the existence of such a wrapper in OpenSSL code, from the comment lines appearing in a description for a macro command. These source code comments read in part: “On some platforms, malloc() performance is bad enough that you can’t just free() and malloc() buffers all the time, so we need to use freelists from unused buffers.”

Translated, the developer is saying that the native performance of the memory allocation function malloc() and the memory release function free()–both part of the standard C library libc–is so slow that the only alternative is to keep a pointer to a list of amalgamated clusters of memory–a freelist.
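
The “freelist” idea de Raadt is criticizing is easy to sketch: instead of handing buffers back to free(), keep them on your own list and recycle them. Below is a minimal, hypothetical illustration in C (not OpenSSL’s actual allocator) of why that worries security people: a recycled buffer comes back still holding its old contents, and misusing it will never trip the protections a hardened system allocator could provide.

```c
/* freelist_sketch.c -- minimal fixed-size buffer freelist, for illustration only.
 * This is not OpenSSL's allocator; it just shows why recycling buffers can hide
 * bugs: a "freed" buffer keeps its old contents and stays mapped and readable.
 */
#include <stdio.h>
#include <stdlib.h>

#define BUF_SIZE 4096

struct buf {
    struct buf *next;                  /* link while the buffer sits on the freelist */
    unsigned char data[BUF_SIZE];
};

static struct buf *freelist = NULL;

static struct buf *buf_get(void)
{
    if (freelist) {                    /* reuse: old contents are still in data[] */
        struct buf *b = freelist;
        freelist = b->next;
        return b;
    }
    return malloc(sizeof(struct buf)); /* fall back to the real allocator */
}

static void buf_put(struct buf *b)
{
    b->next = freelist;                /* "free" it onto our own list instead */
    freelist = b;                      /* note: nothing is wiped or unmapped  */
}

int main(void)
{
    struct buf *a = buf_get();
    if (!a)
        return 1;
    snprintf((char *)a->data, BUF_SIZE, "secret session key material");
    buf_put(a);

    struct buf *b = buf_get();         /* the same memory comes straight back */
    printf("recycled buffer still contains: %s\n", (char *)b->data);
    free(b);
    return 0;
}
```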

In a post to his OpenBSD mailing list Tuesday, de Raadt condemned this type of wrapper, as something he called an “exploit mitigation countermeasure.” Specifically, rather than use code designed to crash in the event of possible exploit (“then the bug can be analyzed, and fixed forever”), de Raadt says OpenSSL developers chose to implement a riskier alternative in hopes of reducing crashes and speeding up performance.

That alternative effectively enables the contents of the heartbeat packet to reveal any 64 KB piece of the server’s memory.  Though that’s not a lot in itself, simply repositioning the bounds and pinging again enables a scan of potentially everything, including the keys used to encrypt the current secure session. With those keys in hand, a malicious actor can easily spoof that server.

Despite the fact that Defensics SafeGuard revealed the existence of this latent bug, in his e-mail to us, Codenomicon’s Ari Takanen said there’s no absolute way for Web servers to determine whether or not they’ve been compromised. For that reason, they should have their current certificates reissued, and existing ones revoked.

Would this mass reissuance of certificates–in addition to Web servers patching their faulty OpenSSL implementations to version 1.0.1g–impact end users and their client systems in any way?

“If done correctly,” responded Takanen, “it should not cause any extra actions for everyday users. If, due to the urgency of the topic, some service providers issue self-signed certificates, then this can temporarily show up for users. Those users who wish to check whether the certificates have been updated can find the information behind the lock icon in their browser.”

For more: 
– read the full story on the Heartbleed bug from Codenomicon

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based community-driven shopping app The Hunt has raised a $10 million

image

San Francisco-based community-driven shopping app The Hunt has raised a $10 million Series B round led by Khosla Ventures with participation from Javelin Venture Partners.

The Hunt helps consumers find and purchase items inspired by photos on social networks, relying on its community of users to identify products and solve “hunts” that other users post.

Founded in 2013, The Hunt has raised around $16 million to date and currently sees 100,000 daily active users.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based small business lending startup Fundbox has raised $17.5 million

image

San Francisco-based small business lending startup Fundbox has raised $17.5 million in a Series A funding round led by Khosla Ventures with participation from SV Angel, Vikram Pandit, Tom Glocer, Jay Mandelbaum, Emil Michael, and other investors.

Fundbox lends business owners the amount they are owed in client invoices, with low interest rates and rewards for early repayment, so that small businesses can continue paying their bills.

Operating in stealth since August 2013, Fundbox has signed up thousands of active users and clears tens of thousands of invoices daily.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based cloud service marketplace AppDirect has raised $35 million

image

San Francisco-based cloud service marketplace AppDirect has raised $35 million in a Series C funding led by Mithril Capital Management with participation from previous investors iNovia Capital and Foundry Group.

AppDirect’s platform connects businesses with brands and developers to help them discover, buy, and manage cloud-based software and services.

Founded in 2009, AppDirect will use the new funds to expand its international presence and continue its product development.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Ingram Micro Cloud Summit: Smooch-Fest for More Cloud Market Hype

image

INGRAM MICRO CLOUD SUMMIT — Technology and partner economics took center stage – literally – on the final day of the Ingram Micro Cloud Summit 2014 event underway this week in Hollywood, Fla.

On Wednesday, Scott Collison, vice president of Hybrid Platform, VMware, told attendees that public cloud technology was unreliable and advised them to deliver hybrid cloud technology to their customers, instead — clouds with VMware software at the heart of them, in other words.

Public clouds, he said, have been optimized for only new applications and require separate management tools. VMware, he said, is “focused on fixing that.”

With Amazon Web Services (AWS), you’re essentially “jumping off a cliff naked into the dark,” he said. With VMware technology, you can move your existing applications to the cloud, with seamless networking, common management and one place to call for support.

Before finishing, Collison said that VMware will unveil a new disaster recovery solution on April 14. The news will be important to partners, he added, because disaster recovery “is a gateway drug to the cloud for small-to-medium customers.”

After Collison spoke, Judy Smolski, vice president, midmarket, with IBM, made the case for Big Blue technology to the assembled Ingram Micro partners. The company, she noted, has spent $7 billion since 2007 on 16 acquisitions to get ready for this “new frontier.” Like Collison, she focused on the opportunities partners have to help customers adopt hybrid clouds.

Mike Fouts, vice president, Americas Channel at Citrix, meanwhile, pointed out that Gartner said Citrix in 2013 had one of the most competitive portfolios for the cloud.

“We think there’s an immediate $1.5 billion opportunity for partners in Data as a Service (DaaS),” he said, adding that “Citrix was in the cloud business before it was cool to call it cloud.”

“We are on a journey to be the best software company in the world. With the best products, partners and solutions,” he said.

In addition to technology, partner economics also dominated discussions on the final day of the event. Gartner Vice President and Distinguished Analyst Tiffani Bova delivered a keynote address during which she challenged solution providers to transform their businesses more quickly. The reason certain vendors have increased their direct sales, she said afterward, is that partners are not moving quickly enough to embrace new market realities. These include the rise of cloud computing, of course, and the diminished control that traditional information technology (IT) departments have in many organizations.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Microsoft Shows off Video Game Performance Improvement via the Cloud

image

In what could be construed as a response to some of the framerate and resolution differences between the Xbox One and the PS4, Microsoft took the stage at the Build conference in San Francisco to demonstrate how cloud computing technology can enhance video game graphics. This is the sort of thing that Microsoft has been talking about since the Xbox One reveal, but we’ve yet to get much of an idea of how it will work when the rubber hits the road.

They showed a demonstration of two high-end gaming machines, one of which is connected to Azure cloud servers and one of which isn’t. When the Microsoft presenter starts loading the scenario up with some complex physics, the unconnected machine struggles to maintain framerate while the connected one clips along at 32 fps. It should be noted that this is not Xbox One footage, but rather a PC prototype.

We’ve already seen the cloud start to influence Microsoft’s Xbox One with Titanfall, which offloads some of its computational work to remote servers to get the game running more smoothly. The key is that while actual graphics are best handled on a local processor, other tasks, notably AI, can run remotely to free up more of the local machinery to focus on graphics. In the video, the cloud is handling tasks related to object rotation and physics.

This is one of those tricky technologies that can be hard to peg. Titanfall makes a convenient test case, because anyone playing it needs an internet connection anyway. But how this will extend past the realm of online games remains a question. From one perspective, the focus on cloud computing reminds us that the initially controversial concept of an “always on, always connected” console is still very much in Microsoft’s sights. We can’t make use of the Cloud as a way to supercharge single-player games without a constant internet connection, and that means any game built from the ground up to take advantage of this technology would be totally unplayable offline.

From Diablo 3 to SimCity, we already have plenty of examples of the pitfalls of online-only single player.

Cloud computing clearly has a place in the future of video game consoles, but I’m skeptical as to how central it will be. It may work well for things like Sony’s Playstation Now service, but gamers have already been vocal about their distaste for tying single-player games to distant servers.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

IBM claims new patent for mobile security technology

image

IBM has developed a technology for reducing the risk of data being exposed in push notifications to mobile devices: a way to encrypt that information so service providers and others can’t actually see any data related to the user’s mobile device.

IBM has just received the patent for its technology, U.S. Patent #8,634,810, “Pushing secure notifications to mobile computing devices,” which was invented at IBM Labs by Benjamin Fletcher, a software engineering researcher. Caleb Barlow, IBM director of application data and mobile security, says the patented technology is based on the idea of a cloud-based service that lets developers create applications that encrypt notification data in the cloud under unique message identifiers; the encrypted content is then securely transmitted to a mobile device via a third-party service provider.

When the end user’s device authorizes the message, the encrypted message content is pushed down from the cloud.

“With this patent, we’re bypassing the push notification and pass a token to the app,” explains Barlow. “You give permission to get notifications.” These notifications could be any kind of update for the apps on the user’s device.

The purpose of the push-notification bypass with encrypted transport is to prevent personal data from being exposed on carrier networks. As an example, IBM points out it could be used by a credit-card company notifying a customer of suspicious account activity.

Barlow didn’t say exactly how and when this new patented technology might roll out as a commercial service or toolkits in the future, but he said it’s part of a larger mobile security strategy at IBM that also includes the acquisitions of Fiberlink and Trusteer. “It’s one of many things we’re working on now,” he added.
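
Reading between the lines of the description, the flow is roughly: the cloud service encrypts the notification and files it under an opaque token, only that token travels through the third-party push provider, and the device redeems the token directly with the cloud service and decrypts locally. Here is a rough, hypothetical sketch of that message flow in C; every type, name, and the toy XOR “cipher” is invented purely for illustration, since the patent does not publish an API.

```c
/* push_token_sketch.c -- hypothetical sketch of a "push a token, pull the
 * content" notification flow. Names, types, and the toy XOR cipher are
 * invented for illustration; a real deployment would use real cryptography.
 */
#include <stdio.h>
#include <string.h>

#define MAX_MSG 128

struct cloud_store {                 /* what the cloud service keeps */
    char token[32];                  /* opaque identifier sent via the push provider */
    unsigned char ciphertext[MAX_MSG];
    size_t len;
};

static void toy_cipher(unsigned char *buf, size_t len, unsigned char key)
{
    for (size_t i = 0; i < len; i++) /* placeholder, NOT real encryption */
        buf[i] ^= key;
}

/* Step 1: the cloud service encrypts the notification and files it under a token. */
static void cloud_store_message(struct cloud_store *s, const char *token,
                                const char *plaintext, unsigned char key)
{
    snprintf(s->token, sizeof s->token, "%s", token);
    s->len = strlen(plaintext);
    memcpy(s->ciphertext, plaintext, s->len);
    toy_cipher(s->ciphertext, s->len, key);
}

/* Step 3: the device redeems the token and decrypts locally with its own key. */
static void device_fetch(const struct cloud_store *s, const char *token,
                         unsigned char key)
{
    if (strcmp(s->token, token) != 0)
        return;                       /* unknown token: nothing to hand over */
    unsigned char msg[MAX_MSG + 1];
    memcpy(msg, s->ciphertext, s->len);
    toy_cipher(msg, s->len, key);
    msg[s->len] = '\0';
    printf("device decrypted: %s\n", msg);
}

int main(void)
{
    struct cloud_store store;
    cloud_store_message(&store, "msg-001", "Suspicious card activity detected", 0x5A);

    /* Step 2: only the bare token "msg-001" crosses the push provider's network. */
    device_fetch(&store, "msg-001", 0x5A);
    return 0;
}
```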

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Top-paying industries for IT 2014

image

Some IT professionals fared significantly better than others this year based on the industry in which they work. Somewhat surprisingly, the Computerworld 2014 Salary Survey respondents who reported the biggest pay increases work for nonprofit organizations, where total compensation — salary plus bonus — was up 4.4% from the previous year.

Other industries that saw strong increases in total compensation include telecommunications (up 4.3%), entertainment/marketing/advertising (up 4.1%), mining/agriculture/construction/engineering (up 3.4%) and legal/insurance/real estate (up 3.3%).

Read the full report: Computerworld IT Salary Survey 2014

Growth in bonuses was highest for IT pros who work in the legal/insurance/real estate industries, with an average 9% increase, and in telecommunications, with an average 8.6% increase. The industries that saw the largest decreases in bonus payments were education (down 9.7%) and health/medical services (down 7.1%). No industries saw a decrease in total compensation or in salary year-over-year.

Here’s a look at the total compensation of a sampling of IT job titles by industry:

Computer services/consulting

    CIO/vice president of IT: $148,500*
    IT manager: $123,262*
    Software engineer: $103,079
    Application developer: $75,289*
    Systems administrator: $73,567*
    Help desk/tech support specialist: $45,052

Education

    CIO/vice president of IT: $168,133*
    Director of IT: $101,315
    IT manager: $75,313
    Network engineer/architect: $71,129*
    Systems administrator: $59,948
    Help desk/tech support specialist: $48,473

Government

    Director of IT: $105,805*
    IT manager: $100,049
    Technology/business systems analyst: $83,635*
    Systems administrator: $80,468*
    Programmer/analyst: $79,573*
    Help desk/tech support specialist: $58,545*

Healthcare

    CIO/vice president of IT: $175,829*
    Director of IT: $113,598
    IT manager: $90,177
    Systems administrator: $71,824
    Technology/business systems analyst: $71,018*
    Help desk/tech support specialist: $51,030*

Legal & insurance

    CIO/vice president of IT: $226,206**
    Director of IT: $132,715**
    IT manager: $102,760*
    Programmer/analyst: $93,811**
    Systems administrator: $85,329**
    Help desk/tech support specialist: $55,499**

Manufacturing

    CIO/vice president of IT: $197,781*
    Director of IT: $132,052*
    IT manager: $99,050
    Systems administrator: $73,467*
    Network administrator: $63,078*
    Help desk/tech support specialist: $56,052*

  * More than 15 responses but fewer than 30
  ** 15 or fewer responses

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based digital diabetes prevention program Omada Health has raised $23 million in a Series B round led by Andreessen Horowitz

image

San Francisco-based digital diabetes prevention program Omada Health has raised $23 million in a Series B round led by Andreessen Horowitz with participation from Kaiser Permanente Ventures and previous investors U.S. Venture Partners and The Vertical Group.

Omada Health creates digital health therapy programs that are covered by insurance providers, starting with its 16-week web-based treatment “Prevent” aimed at addressing prediabetes in adults.

Founded in 2011, Omada will put the funds toward product development and doubling its headcount by the end of the year.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

San Francisco-based mobile app performance management solution Crittercism has raised $30 million

image

San Francisco-based mobile app performance management solution Crittercism has raised $30 million in a Series C funding round led by Scale Venture Partners with participation from InterWest Partners, VMware, and Accenture.

Crittercism operates a platform for companies to monitor mobile app performance and provides a real-time global view of app diagnostics.

Founded in 2011 as an  AngelPad company, Crittercism will use the new funds to invest in enterprise initiatives and expand its presence in South America, Europe, and Asia-Pacific.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Mountain View-based Q&A site Quora has raised $80 million

image

Mountain View-based Q&A site Quora has raised $80 million in a Series C funding round led by Tiger Global with participation from existing investors Benchmark, Matrix, Northbridge, and Peter Thiel.

Quora is a collection of questions and answers created, edited, and organized by its users.

Founded in 2009, Quora has raised over $150 million to date and will use the new cash to fund international expansion and product improvement.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

VCE Forms Foundation for New Cloud Service Provider “Skyscape”

image

To rapidly acquire new customers and be eligible for government procurements, Skyscape Cloud Services needed a converged infrastructure platform with pre-integrated compute, networking and storage to form a foundation for its multi-tenanted cloud solutions.

VCE enabled Skyscape to satisfy the most demanding requirements for security, sustainability and compliance and win significant new business in just 10 months.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

This Satisfying Browser Game Lets You Slap Joffrey From “Game Of Thrones” Over And Over Again

image

When it comes to “love to hate” characters on television, everybody not named “King Joffrey” is playing for second place. There are anti-heroes, sociopaths, and outright monsters stalking the televisual streets of everywhere from Woodbury (both undead and still-breathing) to Pawnee (“You got Jammed!”), but there’s no one as compellingly vile on television as Jack Gleeson’s sneering adolescent king on Game of Thrones. This is a character that the show’s creators have recognized requires the occasional scene in which Peter Dinklage slaps him, just to provide the audience some form of relief.

image

The vicarious thrill of watching Dinklage (or occasionally Lena Headey, who plays the character’s mother) slap Joffrey is an important part of the Game Of Thrones experience, but the creators of the browser-based game Kingslapper (warning, some installation may be required) have figured out that there’s nothing more satisfying than doing it yourself: Controlling a Tyrion Lannister avatar, you swipe in from the right to slap Joffrey’s stupid face. Players get six tries, and are awarded points based on the strength of their slaps. As games go, the play is perhaps a bit simple, but given that every single swipe brings a fresh strike to the face of the King of Creeps, who cares?

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Google Glass vs. Submachine Gun: Not Even Close To A Fair Fight

image

Have you ever wondered what would happen if you shot Google Glass with a submachine gun? Of course you have! Since not everyone has the means to do this, though, Rick Ryder has posted a YouTube video (seen below) of him shooting a set of Glass with not one, but two KRISS Vector submachine guns. Ryder’s video actually consists of 5 tests. The first three (grass drop, concrete drop, and water drop) are pretty standard and (not surprisingly) pretty boring. Glass falls. Glass is picked up. Glass is fine. All include slow-mo shots. The next two tests, however, are significantly more interesting.

Ryder breaks out the big guns (literally) for the final two tests. For clarity, he’s using the KRISS Vector SBR .45 ACP. This is the semi-automatic version of the Vector Submachine Gun, and for those of you wondering,  it’s legal in the US. Bottom line: it’s a powerful gun. For the first of the gun tests, Ryder just uses one of the Vectors. After placing the Glass on a homemade stand, he does what any person would do. He shoots it. In real-time, all you see is wood and pieces of the Glass flying around. This is where the slow-mo comes in handy. While most of the bullets hit the stand or miss everything, the first shot does hit the Glass and shatters the right earpiece. This certainly isn’t enough though.

The second and final gun test results in the most carnage. Ryder shoots two Vectors simultaneously in a wonderful display of bullets and casings. Again though, real-time isn’t very helpful. Bits of glass and wood flying around is all that’s seen. Video technology saves the day again with slow-mo, and the destruction is seen. The first shot hits again, and this time, there’s no hope for the Glass. As you can imagine, Ryder had a hard time finding all of the pieces to the Glass. The actual lenses were never seen again after the second round of gunfire, but he was able to gather most of the frame. He also added a little humor at the end of the video. Check it out. It’s a good watch.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Fred Gambino’s upcoming concept art book Dark Shepherd: The Art of Fred Gambino

image

Every time she returned home, she was tempted to swing past the space station and land on the broken moon and plumb its ancient caverns. Perhaps the next time she had a week’s worth of holiday chits, she’d blow it on an amateur excavation.

This image is “Dark Shepherd Moon” and it comes from concept artist Fred Gambino’s upcoming concept art book Dark Shepherd: The Art of Fred Gambino, which is available for pre-order on Amazon. The book is part retrospective and part concept art book as Gambino revisits his career as a concept artist (most recently on Guardians of the Galaxy) and illustrates scenes from his own ongoing story Dark Shepherd.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Yoshi takes a stand in Super Smash Bros. Wii U and 3DS

image

The Super Smash Bros. character roster couldn’t just rejoice in the return of Zero Suit Samus and Sheik, with series creator Masahiro Sakurai also sharing that Yoshi is back.

As a returning fighter he’s undergone a more significant revision: he no longer stands with a bent back and now stands fully upright. Sakurai stated that the change “has made Yoshi even stronger”, so we certainly can’t wait to take him for a spin.

Charizard and Greninja were also announced as new playable characters toward the end of the broadcast, increasing the number of Pokémon brawlers.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

The California Global Warming Hoax Credit Scam: How They’re Fleecing Entrepreneurs for Millions Instead of Learning to Budget Their Own Money

image

Could the California Air Resources Board (CARB) be taking a $55-million bite out of Tesla Motors’ profits? The state regulator, which grants zero-emission vehicle (ZEV) credits for automakers making plug-in vehicles, is planning to reduce the number of credits generated by each Model S battery-electric sedan from seven to four, Bloomberg News reports. That means the California-based automaker will have fewer credits to sell to big buyers such as General Motors and Chrysler, who don’t make enough ZEVs on their own to comply with state mandates.

While the selling price for these credits isn’t disclosed (they’re private transactions), the market was a lucrative one for Tesla, which generated $129.8 million in revenue from California zero-emissions credit sales and about another $65 million selling US Corporate Average Fuel Economy (CAFE) credits last year. All told, California and federal zero-emissions credit sales accounted for about 10 percent of Tesla’s sales last year. A Tesla representative didn’t immediately respond to a request from AutoblogGreen for comment.
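
To get a feel for why a cut from seven credits per car to four translates into a figure like $55 million, here is a rough back-of-the-envelope Python sketch. The sales volume and per-credit price below are assumptions chosen purely for illustration (actual credit prices are private), not reported figures:

```python
# Rough illustration of the ZEV-credit revenue math -- all inputs are assumptions.
credits_per_car_old = 7          # credits per Model S under the old CARB rules
credits_per_car_new = 4          # credits per Model S under the proposed rules
assumed_cars_sold = 6000         # hypothetical annual sales counted toward the program
assumed_price_per_credit = 3000  # hypothetical dollars per credit (real prices undisclosed)

lost_credits = (credits_per_car_old - credits_per_car_new) * assumed_cars_sold
lost_revenue = lost_credits * assumed_price_per_credit

print(f"Credits lost per year: {lost_credits:,}")
print(f"Estimated revenue impact: ${lost_revenue:,}")  # ~$54M with these made-up inputs
```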

This issue first came up last year when CARB hinted that it wouldn’t give Tesla credit for having a battery-swapping option as its method for quick-fueling compliance. Tesla, which appears to have been preparing for just this scenario, has been collecting revenue on credits since 2010 and achieved its first-ever profitable quarter in the first quarter of 2013 because of such credits.

While the maximum number of zero-emissions credits a vehicle can garner was increased from seven to nine in the new rules, Tesla can’t take advantage of that because it meets neither of the two most stringent criteria: that the car in question be rated to go more than 300 miles on a full tank or battery, and that it can be “filled up” (or fully charged, in this case) within 15 minutes. Those are more hydrogen fuel-cell-like targets, but Tesla has the EVs that come closest to meeting them.

The sad part of all of this is the fact that carbon dioxide is something we all need to survive, and the “Global Warming Hoax” was already debunked two years ago, when leaked emails showed the data being manipulated and doctored at the UK facility where the so-called “computer models” used to “prove the theory” were being generated. All of that was so Al Gore could reap his billions via HIS global-warming credit-selling business, into which he and others invested hundreds of millions expecting to push their legislation through Congress. That legislation flopped, even after he won a Nobel Prize for doing absolutely nothing except attempting to scam the American people and businesses out of billions.

So now California is doing it? Let me guess… Barbara Boxer and Dianne Feinstein have been spending time with ol’ Al Gore…

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Land Rover reveals ‘see-through’ bonnet for Discovery Vision concept

image

The new Land Rover Discovery Vision concept will mark the debut of Land Rover’s radical ‘see-through bonnet’, which uses augmented reality technology to give the driver a view of what is underneath and in front of the car.

This innovation could make driving off road easier because the driver can not only see the terrain ahead but can also track the position of the front wheels.

The key is a new type of ‘smart’ windscreen, which can display a full-width computer-generated image, delivered by cameras mounted in the car’s grille. This is a big leap from today’s head-up displays, which are restricted to a tiny portion of the driver’s field of view.
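
Conceptually, the system is a compositing problem: take the grille-camera frame, project the predicted front-wheel paths into it, and render the result across the windscreen. The Python sketch below is a deliberately simplified illustration of that idea; the camera resolution, the wheel-path model and every function name are assumptions, not Land Rover’s implementation:

```python
import numpy as np

FRAME_W, FRAME_H = 1280, 720  # assumed grille-camera resolution

def wheel_path(steering_angle_deg: float, side: str) -> list[tuple[int, int]]:
    """Very rough projected wheel path: a curve that bends with the steering angle."""
    x0 = FRAME_W * (0.35 if side == "left" else 0.65)
    points = []
    for t in range(100):
        y = FRAME_H - t * 5                       # march up the frame, away from the car
        x = x0 + steering_angle_deg * t * 0.15    # bend with the steering input
        points.append((int(x), int(y)))
    return points

def composite_overlay(frame: np.ndarray, steering_angle_deg: float) -> np.ndarray:
    """Draw both wheel paths onto a copy of the camera frame."""
    out = frame.copy()
    for side in ("left", "right"):
        for x, y in wheel_path(steering_angle_deg, side):
            if 0 <= x < FRAME_W and 0 <= y < FRAME_H:
                out[y, x] = (0, 255, 0)           # mark the path in green
    return out

# Usage with a dummy black frame standing in for live camera video:
frame = np.zeros((FRAME_H, FRAME_W, 3), dtype=np.uint8)
augmented = composite_overlay(frame, steering_angle_deg=12.0)
```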

Wolfgang Epple, director of R&D at Jaguar Land Rover, said the transparent bonnet would also be highly useful in urban conditions.

Land Rover’s Discovery Vision concept will preview the firm’s new family of Discovery vehicles, which are due to arrive in 2015.

The new family, described as a collection of “premium leisure SUVs”, will feature a bold new design first seen in production form on the replacement for the current Freelander.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Open Source Cloud Continues to be Plagued with Problems due to lack of Standards

image

LAS VEGAS — IT pros representing large organizations say the best way to do cloud at scale is to do it yourself using open source tools.

That was a common theme running through presentations here at the Cloud Connect Summit in which cloud architects representing companies including Warner Music Group, the U.S. federal government, Target and PayPal, presented on their experiences using custom tools to create large cloud computing environments.

“Git is our toolbox,” said Jonathan Murray, executive vice president and chief technology officer of Warner Music Group. “If you need a solution that doesn’t exist, look on GitHub.”

Open source software for running clouds available on GitHub includes Netflix OSS, a toolkit that addresses availability, cloud management, infrastructure services and developer productivity, among other things. Another big fish in the cloud, IBM, is one of the biggest adopters of Netflix OSS, according to a presentation by Adrian Cockcroft, formerly of Netflix, now a technology fellow at Battery Ventures. 

In fact, scratch the surface of many proprietary vendor tools and you’ll find open source underpinnings, said Mayuresh Shintre, cloud platform architect for the Target retail chain.

Shintre expressed dissatisfaction with proprietary cloud management platforms that aim to manage multiple clouds, saying they reduced the functionality available in cloud-native application programming interfaces (APIs) to about 30% of the original feature set.

“There is a tradeoff,” Shintre said. Enterprise cloud architects can let end users go cloud-native via APIs and get full cloud functionality and freedom to experiment, but that leaves aside governance and risk management, he said.

“The higher up you go [in abstraction layers] you gain some benefit, but then you end up compromising the ability for a developer to fully harness the native cloud feature sets,” he added.

Many proprietary cloud management platforms that purport to be multi-cloud today are also Amazon Web Services-centric, Shintre said.
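
The tradeoff Shintre describes is easy to picture in code: a lowest-common-denominator wrapper can only expose the operations every provider supports, hiding the richer native API. The following Python sketch is a hypothetical illustration (the class and method names are invented and do not come from any real cloud management product):

```python
from abc import ABC, abstractmethod

class MultiCloudManager(ABC):
    """Lowest-common-denominator interface: only what every provider supports."""

    @abstractmethod
    def launch_instance(self, image: str, size: str) -> str: ...

    @abstractmethod
    def terminate_instance(self, instance_id: str) -> None: ...

class NativeAcmeCloudClient:
    """Hypothetical native client with a much richer, provider-specific feature set."""

    def launch_instance(self, image: str, size: str) -> str:
        return "i-12345"

    def terminate_instance(self, instance_id: str) -> None:
        pass

    # Native-only features the generic wrapper cannot expose portably:
    def attach_gpu(self, instance_id: str, gpu_type: str) -> None: ...
    def enable_spot_bidding(self, instance_id: str, max_price: float) -> None: ...
    def tag_for_autoscaling(self, instance_id: str, policy: str) -> None: ...

class AcmeAdapter(MultiCloudManager):
    """Adapter exposing only the portable subset -- the ~30% Shintre mentions."""

    def __init__(self) -> None:
        self._native = NativeAcmeCloudClient()

    def launch_instance(self, image: str, size: str) -> str:
        return self._native.launch_instance(image, size)

    def terminate_instance(self, instance_id: str) -> None:
        self._native.terminate_instance(instance_id)
```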

Time to value, or the time it takes for IT to create cloud architectures of use to the business, is both a crucial and difficult component of delivering clouds at high scale, said Warner Music’s Murray.

“With any vendor that turns up at your door and tells you they have the solution to the time-to-value problem, there’s one of three things going on,” he said. “The first thing is, they don’t really understand the problem, the second thing is that they may understand the problem, but in order to solve the problem you’ve got to buy in only to their view of the world, or basically they’re lying. Take your pick.”

The trouble with open source cloud — lack of standards

Customizable, open source software is all well and good for companies willing to do the work, but one consultant working for the U.S. federal government found a mess when he tried to find industry standards around cloud for his organization to implement.

There are about 65 emerging standards meant to address cloud computing, according to Michael Biddick, CEO of Fusion PPT, a strategy and technology consulting firm based in Vienna, Va.

Within that, “there’s a whole universe of Internet technology and standards that can be applied to interoperability and portability,” which were priorities for the government as it explored cloud computing.

“Our customers were hoping for a universal portability mechanism,” he said. But such a tool does not exist.

“If you are a believer in standards, it is incumbent upon you to try to move vendors and service providers in that direction,” he told IT pros here. “There is no silver bullet.”

Another issue when it comes to open source software is finding the talent to work with it, which all the speakers acknowledged is a big problem in the market today.

Observers watching the presentations also asked how smaller companies could take advantage of the benefits these larger players have realized through customization of open source utilities.

That’s where OpenStack comes in, according to Scott Carlson, cloud infrastructure architect for PayPal.

Most of the vendors in the IT infrastructure industry have jumped on the OpenStack bandwagon, and are “really, legitimately trying hard … we don’t even have to ask any more if they support OpenStack,” Carlson said. “You can find somebody to work with you.”

Two representatives of a relatively small government contractor in the audience said they balance cloud development between VMware Inc.’s vCloud stack and an OpenStack cluster in the research phase.

“Open source tools have developed a lot in the last couple years,” said one of the contractors, who requested anonymity. “But there’s a flexibility versus time-to-value tradeoff.”

Beth Pariseau is senior news writer for SearchCloudComputing.com. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

U.S. hits H-1B cap with ‘high number’ of petitions

image

The U.S. government said today that it has reached the H-1B cap. If this year is similar to previous years, 70% of applicants will be under the age of 35, and a major portion will take jobs at offshore outsourcing companies.

The U.S. Citizenship and Immigration Service (USCIS) said it received a “high number” of H-1B petitions, but was unable to give a final tally because it was still counting them.

The fact that the U.S. is still processing the applications may indicate that it has received more H-1B petitions than last year. There were 124,000 H-1B visa petitions submitted in calendar year 2013 (last April) for use in the 2014 fiscal year.

The H-1B program has a regular, or base, cap of 65,000, and another 20,000 visas for those who earned a master’s degree or higher in the U.S. Both caps were met.

Because the number of petitions exceeded the cap, the U.S. distributes them via a computer-generated lottery.
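
A capped random draw is simple to sketch. The Python below is a simplified illustration only, not USCIS’s actual procedure; it assumes the advanced-degree pool is drawn first and that unselected master’s petitions then fall into the regular pool, roughly as the process was described at the time:

```python
import random

REGULAR_CAP = 65_000
MASTERS_CAP = 20_000

def run_lottery(regular_petitions: list[str], masters_petitions: list[str]) -> set[str]:
    """Simplified two-stage draw: master's-cap lottery first, then the regular cap."""
    selected: set[str] = set()

    # Stage 1: advanced-degree exemption pool.
    masters_winners = random.sample(
        masters_petitions, min(MASTERS_CAP, len(masters_petitions))
    )
    selected.update(masters_winners)

    # Stage 2: everyone else, including unselected master's petitions.
    remaining = [p for p in masters_petitions if p not in selected] + regular_petitions
    selected.update(random.sample(remaining, min(REGULAR_CAP, len(remaining))))
    return selected

# Example with made-up petition counts on the order of the 2013 filing season:
regular = [f"R{i}" for i in range(90_000)]
masters = [f"M{i}" for i in range(34_000)]
print(len(run_lottery(regular, masters)))  # 85,000 selections in total
```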

The U.S. begins accepting H-1B petitions for the next fiscal year on April 1. The 2015 fiscal year begins Oct. 1.

High demand for H-1B visas was widely expected and follows gains in tech hiring. Approximately 60% of all H-1B petitions approved go to people in computer-related occupations.

Other types of professions receiving H-1B visas include physicians, teachers and professors, accountants, and other professions requiring degrees.

A majority of the H-1B visas in the computer-related occupations will go to IT services companies, who are also the major users of H-1B visas.

The IT industry is lobbying hard to increase the H-1B cap and frames it as an issue for retaining foreign nationals who graduate from U.S. schools. However, the data shows that the largest users of H-1B visas operate offshore operations. The top three H-1B users, Infosys, Tata Consultancy Services, and Cognizant, accounted for 27% of the 65,000 H-1B cap petitions.

David Foote, who heads the labor research group Foote Partners, estimated that IT jobs increased by 128,500 last year, but most of that hiring was in IT services segments.

H-1B visa holders are relatively young. Approximately 70% of H-1B petitions approved are for workers between the ages of 25 and 34, according to the most recent USCIS profile data.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

Small to Medium Enterprise Spending set to increase drastically

image

SME (small to medium enterprise) spending on IT equipment is set to increase across Europe as customers look to update their infrastructure and ensure that the back office can keep pace with changes in other parts of the organisation.

Evidence of increased spending comes from GE Capital’s latest SME Capex Barometer, which shows that firms in the UK, Germany, France and Italy are set to spend €73bn on IT equipment in the next year, a 15% increase on 2013.

The UK is leading the field with a 56% improvement on the last 12 months and €21.5bn expected to be spent. Hardware takes the largest share of that capital expenditure, with billions earmarked for upgrades of laptops, servers and other devices.

Although spending on software is expected to be lower across Europe, at €31bn, the largest share will again come from the UK, which GE Capital predicts will spend €9.4bn.
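
Those growth figures also imply last year’s baselines, which a quick back-calculation recovers. The Python sketch below derives them from the quoted percentages; the results are inferred, not numbers reported by the barometer:

```python
# Back-calculate implied 2013 baselines from the quoted growth rates.
projected_total_it_spend = 73.0   # EUR bn, UK + Germany + France + Italy, next 12 months
total_growth = 0.15               # 15% increase on 2013

projected_uk_it_spend = 21.5      # EUR bn
uk_growth = 0.56                  # 56% improvement on the last 12 months

implied_2013_total = projected_total_it_spend / (1 + total_growth)
implied_2013_uk = projected_uk_it_spend / (1 + uk_growth)

print(f"Implied 2013 four-country IT spend: ~EUR {implied_2013_total:.1f}bn")  # ~63.5
print(f"Implied 2013 UK IT spend: ~EUR {implied_2013_uk:.1f}bn")               # ~13.8
```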

“After several years of prioritising spending on manufacturing equipment assets, SMEs now look to be increasing IT and office equipment capex, potentially in order to update their infrastructure and back office systems to match modernisation efforts at the front end,” said Christian Bernhard, equipment finance leader at GE Capital International.

“Given the productivity gains, cost efficiencies and competitive advantage associated with up-to-date technologies, SMEs that increase investment in upgrading IT equipment will be strongly positioned for future growth,” he added.

At the same time a survey from Deloitte indicated that those responsible for spending at British firms were feeling a lot more positive about taking risks as uncertainty about market conditions continued to decline.

As well as increased spending, the financial specialists found that 81% of the CFOs surveyed were expecting to hire more staff in the 12 months ahead, and that levels of spending were returning to 2007 levels.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International

PaaS: Another Dumb Marketing Acronym Causing Mass Confusion

image

“Platform as a Service”: if that is not about the most ambiguous description of something you’ve ever heard, then I don’t know what is. I’ve always thought it was a stupid name for the third (market? segment? platform? sector?) whatchamajigger of cloud computing. I’ve always had my own idea of what it means, and as someone who has been designing and building cloud infrastructures (over 20 since 2004) I believe I know what I’m talking about to some extent. About a year ago, though, I began to see stories from analysts (so-called experts who have never designed or built one in their lives) telling me it means something completely different. Then I decided to examine some supposedly actual PaaS services and was completely dumbfounded by how cheesy they were. It was not what I had envisioned, not even close.

So I’ve decided to develop my own PaaS solution and hopefully show everyone how it should be done, what it “was originally meant to be”.

So look for the “real” definition of a PaaS service to launch later this year from a company that is already in the process of launching six disruptive, cutting-edge startups this year alone: the stealth-mode company Synapse Synergy Group, a technology think-tank that has been able to acquire some of the most brilliant developer talent from all over the world, including Poland, South Africa, the UK, Germany, Spain and the US.

In a panel on “The Future of PaaS in an IaaS World” at Cloud Connect Summit, co-located with UBM Tech’s Interop Las Vegas, there was a surprising amount of disagreement on how to define platform-as-a-service as a form of cloud computing. Each member of the panel, which included several well-known cloud spokesmen, had a different definition.

Mark Russinovich, a technical fellow on the Microsoft Azure team, said he sees PaaS as “writing code that is integrated with a runtime environment, as opposed to code that is dropped into a virtual machine that’s sitting on a bare-metal server, a legacy kind of server. That’s the key differentiator point. The software knows something about the environment it’s running in.”

Margaret Dawson, HP’s cloud evangelist and VP of product management, claimed: “It’s really about that full environment for application development all the way through full, lifecycle management, even some of the orchestration stuff. It’s about a full environment, not only for development of the application. To me, it adds a layer above IaaS.”

Jesse Proudman, founder and CEO of Blue Box Group, a hosting service that, among other things, provides developer services and manages large-scale Ruby applications for customers, said: “For me PaaS is really about the service catalogue — consumable types of services — whether it be application delivery or container service. It’s that abstraction that delivers the ability to move workloads from cloud to cloud. I think that’s one of the most powerful features of PaaS technology in the market today.”


Brent Smithurst, VP of product management at ActiveState, supplier of Stackato PaaS software, said Stackato is “a platform-as-a-service based on Cloud Foundry and our primary market is Fortune 500 enterprises who use the platform in-house, on-premises. We’ve actually tried to get away from calling it PaaS. We really just call it an application platform.”

Krishnan Subramanian, director of Red Hat’s OpenShift platform strategy, said that, in addition to Linux containerization and open source tools, “I have a simple definition for PaaS. The application scales with the platform. It scales with the infrastructure seamlessly.”

So cloud platform-as-a-service, according to the PaaS experts, is a platform where the software knows about the environment in which it’s running. It’s also full application lifecycle management, from development through deployment and its production life. It’s also a catalogue of application services. It’s also an “application platform”, and it’s a platform that can scale with the application seamlessly. Is that clear?
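
Russinovich’s point that “the software knows something about the environment it’s running in” is the easiest one to make concrete. On Cloud Foundry-style platforms (the family Stackato builds on), the platform injects bound-service credentials through environment variables such as VCAP_SERVICES. The Python sketch below reads such a binding; the service name “my-postgres” and the credential fields are illustrative assumptions, not a fixed schema:

```python
import json
import os

def get_bound_database_uri() -> str:
    """Read a database URI injected by the platform, or fall back for local runs.

    On a Cloud Foundry-style PaaS, VCAP_SERVICES holds a JSON map of bound
    services; the 'my-postgres' name and 'uri' field below are illustrative.
    """
    raw = os.environ.get("VCAP_SERVICES")
    if not raw:
        return "postgresql://localhost:5432/devdb"  # local development fallback

    services = json.loads(raw)
    for bindings in services.values():
        for binding in bindings:
            if binding.get("name") == "my-postgres":
                return binding["credentials"]["uri"]
    raise RuntimeError("No binding named 'my-postgres' found in VCAP_SERVICES")

if __name__ == "__main__":
    print(f"Connecting to: {get_bound_database_uri()}")
```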

Proudman listened to the definitions and inserted an additional thought: “I really believe PaaS as a technology stack focuses on application delivery; it goes beyond just packaging up applications or services and really needs to provide a full orchestration chain to deliver those applications.” This comment makes deployment a more important part of PaaS.

Dawson also added a thought on why she continues to see PaaS as a distinct cloud layer separate from IaaS. “One reason that it doesn’t become part of IaaS is you’ve got to be able to have application portability. If it’s just tied to one type of IaaS, then you don’t have that portability.”

Red Hat’s Subramanian, however, disagreed. “I don’t think it’s just application portability… It’s application portability and portability of application environments.” That is, all the things that the application needs to run — its database interface, middleware, and security policies — need to become portable along with the application itself.

What is sad is that people use these marketing acronyms all the time without ever even knowing what they mean. I wrote an article many years ago about the mass-marketing addiction to acronyms, as I’ve watched it take perfectly good terms such as “wireless” and try to confuse people with “wifi”, or “ASP” with “SaaS”, or “virtual server” with “VPS”. It never ends. I just heard another one the other day that was so stupid I railed about it for five minutes to a good friend who used it and forced me to ask him what the hell he was talking about. It was another marketing acronym for something that already had a perfectly good name. My point is that the marketing junkies have created this mess, and we should all stop propagating their garbage unless we actually know what we’re saying and why. If there’s already a good name for something, refuse to propagate the confusion. I make it my small part not to use stupid acronyms just because “it’s the new thing”. I don’t know about you, but “wireless” is more descriptive and a much better term than “wifi”, so I think I’ll continue to use it, thank you very much indeed.

By Jarrett Neil Ridlinghafer 
CTO of the following –
4DHealthware.com
Synapse Synergy Group
EinDrive.com
HTML5Deck.com
PerfectCapacity.com
CSPComply.com
Chief Technology Analyst, Author & Consultant
Compass Solutions, LLC
Atheneum-Partners
Hadoop Magazine
BrainBench.com
Cloud Consulting International
