Marist College, which is located on the east bank of the Hudson River in Poughkeepsie, New York, is well known for its use of technology, having been identified recently by Forbes and The Princeton Review as one of the top 25 most connected campuses in the US.
Thirsk says technology also plays a critical role in training the next generation of datacentre managers, data scientists, and enterprise IT specialists at the university.
“The technology has always been geared towards using and teaching enterprise-scale architectures and systems because we’ve always viewed that element as the most important part of technology: doing things at scale,” Thirsk says.
While most universities are looking to move away from managing infrastructure internally amid shrinking budgets, Marist built and continues to operate its own cloud platform. It runs zLinux on IBM mainframes, where it hosts all of the university's internal systems (ERP, HR) and e-learning software.
The university also operates an academic community cloud, a stack of software services that it provides to non-profit organisations and for-profit companies, built on the same platform that hosts its internal systems.
It also uses its own datacentre as a training site for students.
“We think there’s going to be a bubble in higher education much like real estate,” Thirsk says.
“Parents and students aren’t going to keep paying the high price for higher education without seeing better results, and a lot of schools are looking at their cost structure and deciding that managing technology is not a core discipline for them.
“But given what we teach we decided it is core to us, not just for hosting our own stuff but getting the whole organisation involved in becoming a cloud provider,” he says. “The best way to learn is to do.”
It also offers a curriculum on open source enterprise computing, and has an incubator called the Cloud Computing and Analytics Centre, which deals in Big Data analytics and focuses on certifying users on business intelligence, for which there is huge market demand (Thirsk says the university runs at nearly 100 per cent placement in this field).
It also works closely with a number of Fortune 500 companies to fine tune its curricula and, in some cases, commercialise student-developed solutions.
“When our students do a project, they aren’t just going to do a science project and leave it on the shelf; they’ll actually watch the work they’ve done from inception all the way through to commercialisation. It also helps guarantee them a job.”
About 80 to 90 per cent of the course work is conducted online using a suite of collaboration tools, platforms, and online resources, and the university even developed a desktop virtualisation platform for distance learning, which allows students to log in from all over the world to access massive data sets.
Given that some of these courses teach how to set up Big Data projects, creating a virtualised multi-tenanted environment that could link students across the world and enable them to work with massive volumes of data without much latency is a huge feat.
“What we did was consolidate cloud delivery of not just big data sets and Cognos on the Z or the open source tools from the mainframe and Linux side, but we put Windows 7 images on our servers that were delivered as a desktop image so the student didn’t have to load anything on their machine,” he says.
“They were served all the applications, modelling, scripting, data, and they seamlessly connected to the huge data sets on the mainframe end without them knowing that they’ve actually crossed a boundary from a desktop to a modern large-scale machine.”
Thirsk explains that creating what was effectively a desktop-as-a-service platform was very difficult, particularly considering the volume of users and the distances between end users and the core infrastructure. The university engaged with the Apache Open Lab Project and tweaked it with sequence scripting so that a student’s seat on the platform is dynamically reallocated once the student is done with their session.
“Beyond scripting and tweaking, the hardest thing to do, quite frankly, was the licensing. Microsoft hadn’t seen this before, where you have multiple seats sitting on a single CPU with 25 simultaneous users, and it took them a couple of months to agree to license this for us,” he says. “It’s a new model for them and they are so big, it’s hard for them to work nimbly, I suspect.”
Big Data to improve student performance
The university doesn’t just teach its students how to implement and use Big Data in their careers. It uses an analytics platform it developed, called Learning Analytics, that can determine within two weeks of a course starting, and within a two per cent margin of error, how well a student is going to do: the likelihood a student will receive a B or above, a C or below, and the like.
Because so much of the learning process at the university takes place online, the school can collect a lot of information about student engagement with its curricula. It pools everything from clicks and time spent on pages and online resources to frequency of questioning, in order to assign each student an aggregate level of engagement, which is then compared with class averages.
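The aggregation described above can be sketched in a few lines of Python. The signal names, weights, and at-risk cutoff below are illustrative assumptions, not details of Marist’s actual model:

```python
# Minimal sketch of an engagement score aggregated from online activity
# signals and compared against the class average. The metrics, weights,
# and cutoff are assumptions for illustration only.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Activity:
    clicks: int            # total clicks within the course site
    minutes_on_pages: int  # time spent on pages and online resources
    questions_asked: int   # frequency of questioning

def engagement(a: Activity) -> float:
    # Weighted aggregate of the raw signals (weights are assumed).
    return 0.2 * a.clicks + 0.5 * a.minutes_on_pages + 5.0 * a.questions_asked

def flag_at_risk(students: dict[str, Activity], cutoff: float = 0.5) -> list[str]:
    """Flag students whose engagement falls well below the class average."""
    scores = {name: engagement(a) for name, a in students.items()}
    class_avg = mean(scores.values())
    return [name for name, score in scores.items() if score < cutoff * class_avg]

cohort = {
    "alice": Activity(clicks=120, minutes_on_pages=300, questions_asked=6),
    "bob": Activity(clicks=15, minutes_on_pages=40, questions_asked=0),
    "carol": Activity(clicks=90, minutes_on_pages=250, questions_asked=3),
}
print(flag_at_risk(cohort))  # → ['bob']
```

A real system would feed the flagged list into a human intervention process rather than an automated message, for the reasons Thirsk describes below.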
The university had so much success with the platform that it open sourced it and gave it to five other (very different) academic institutions.
“We didn’t know if this model would work in a small community college as well as it would in a big university, and in fact it does,” he says.
But the university’s experience with Big Data highlights what many, in flocking to these technologies, inevitably experience: you still have to know how to interpret that data, and what to do with it.
“If you don’t have a subject matter expert to handle big data, for your company or organisation, it’s a complete disaster.”
“When we first developed and used Learning Analytics and saw students failing, we simply sent students a message telling them they’re failing. And most of them just dropped the course. It was only when a faculty member came to us and told us we were chasing students away that we went and spoke to teaching experts, who told us we needed to find out why they’re failing, give them support to resolve those issues, and reengage them with the curriculum.”
The university gave the platform to a community college in a very economically depressed area; the school had a very low rate of completion of courses. When Marist looked at the analytics and talked to education specialists, it found the students were too poor to buy the textbooks.
“These students were winging it instead of using the materials because they just couldn’t afford them. So we put open source textbooks within their courses, and their performance went through the roof,” Thirsk says.
“There are so many things that go into a person declining academically: a breakup, a family issue, a money issue. We had to learn how to intervene most appropriately so that they don’t give up, but continue and indeed go on to succeed. If anything, using Big Data to improve learning outcomes taught us some things about teaching, particularly us on the IT side of things, that we didn’t know.”
“If you want to turn data into success, you need to know what that data means and how to work with it.”
By Jarrett Neil Ridlinghafer
Chief Technology Analyst, Author & Consultant
CTO of Synapse Synergy Group, Compass Solutions, LLC, and Cloud Consulting International