12/01/2021 | Lab Innovation

Smart Labs

Laboratories lie at the heart of scientific change. And when you have the most conservative of institutions analysing billions of samples per day, change is naturally slow – but its effects will be dramatic.

Laboratories are not known for their speed when it comes to implementing change. While technological innovation is something we can see rolling out constantly in environments such as consumer electronics, the chemistry lab, probably in keeping with the precise nature of the work that takes place there, takes a far more cautious route. But there are those who believe that may be about to change, at least in part, given the emergence of new technology focused exclusively on making the cutting edge of medical research even more efficient, something aided in no small part by those within the ACHEMA community.

The growing adoption of the likes of voice recognition, augmented and mixed reality, collaborative robotic technology and Artificial Intelligence raises as many questions as it answers, some as far-reaching as whether scientists still need to be physically present at all.

Such developments were discussed at a recent Pulse Live Event, where the question of the pace of change was put to experts who agreed that the slow progress to date came down to time and money in a sector driven by lengthy sign-off processes and a huge sense of obligation in terms of safety.

Denis Ozdemir, head of customer success at LabTwin, the firm behind the first voice-powered digital lab assistant, said one of the biggest indicators of the lack of progress within the laboratory environment is the sheer amount of physical documentation present, something totally at odds with a rapidly advancing, digitalised world. “If you dig out the pharma company in 1,000 years you might think it’s a paper company because there is so much of it,” he said, adding: “We help customers directly and contemporaneously capture the data and bring it directly into their sheets, their ELNs and their LIMS so you don’t have to give that extra effort any more.” He noted that one advantage has been the “huge advances” in speech recognition software since the days when it misheard, replaced and even misunderstood words entirely.

“If you look at something like Google’s speech-to-text you are at human-level accuracy, so a human standing next to you could not understand it better than the machine,” he said. “What we realised is that these general-purpose speech-to-text technologies don’t have sufficient accuracy. You say pipette, they say iPad, because iPad is the more common word.”

Closer to home, he explained: “We have a team of data scientists at LabTwin and we continuously train with the scientific vocabulary of our customers and, to be honest, there is no magic bullet. We don’t have a secret recipe: it’s four years of hard work collecting data, of continuously improving together with our customers to make it better and better and now we are on a level where, actually, our customers are saying, yes, we get the value out of it.”
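
Purely as an illustration – this is not LabTwin’s pipeline, and the terms and scores below are invented – the idea of biasing a general-purpose speech engine towards lab vocabulary can be sketched in a few lines of Python: the engine’s alternative transcriptions are re-scored so that hypotheses containing domain terms such as “pipette” win out over more common everyday words.

```python
# Hypothetical sketch: re-ranking speech-to-text alternatives with a lab lexicon.
# Not LabTwin's implementation - just an illustration of why a general-purpose
# engine prefers "ipad" over "pipette" and how a domain vocabulary tips the balance.

# Each alternative transcription comes with the engine's own confidence score.
alternatives = [
    ("transfer 200 microlitres with the ipad", 0.62),
    ("transfer 200 microlitres with the pipette", 0.58),
]

# Terms this lab cares about, each with a small score bonus (invented values).
LAB_LEXICON = {
    "pipette": 0.15,
    "microtiter": 0.15,
    "centrifuge": 0.10,
    "microlitres": 0.05,
}

def rescore(text, base_score):
    """Add a bonus for every domain term the hypothesis contains."""
    bonus = sum(w for term, w in LAB_LEXICON.items() if term in text.lower())
    return base_score + bonus

best_text, _ = max(alternatives, key=lambda alt: rescore(*alt))
print(best_text)  # -> "transfer 200 microlitres with the pipette"
```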

It’s a question of getting used to it initially, something made easier by the fact that the data is also displayed visually on devices such as phones and tablets.

“The best way to control the quality is that you directly see the input, at least initially, to train the system,” he went on, adding: “It goes right 20 times and on the 21st time there’s a mistake in a word, you correct it and LabTwin learns from it. It gets better and better and after a while you won’t need the visual any more and that’s when people trust it.”
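
The correction loop Ozdemir describes can be pictured with a similarly hypothetical sketch – not LabTwin’s actual code, just an illustration – in which the scientist fixes the occasional mis-heard word and the corrected term is fed back into the domain lexicon so it is favoured next time.

```python
from collections import Counter

# Hypothetical sketch of the correction loop described above. Illustrative only.

lexicon = Counter()  # domain terms and how often users have confirmed them

def confirm(transcript, correction=None):
    """Accept a transcript; if the user corrects a word, learn from it."""
    if correction:
        wrong, right = correction
        transcript = transcript.replace(wrong, right)
        lexicon[right] += 1  # boost the corrected term for future recognitions
    return transcript

# Twenty transcripts sail through untouched...
confirm("aliquot the supernatant into well B4")
# ...and on the twenty-first the user corrects a single word.
print(confirm("rinse the i pad tip with ethanol", correction=("i pad", "pipette")))
print(lexicon)  # Counter({'pipette': 1})
```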

The ‘sunglasses’ that can see everything

Hauke Heller, systems engineer at the Hamburg biotechnology company bAhead, said the same was true of the Cobot in terms of preparing it for use. “When it comes out of the box it needs time,” he said, describing how no complex programming language is needed to make it perform even the simplest tasks; instead, it offers a drag-and-drop interface. Given that its use is now intuitive as far as the user is concerned, how far can it go in challenging human input? Does it make a better technician, for example?

“Maybe not better,” he said, adding: “You eliminate all the errors a human would make. Sometimes you forget something. Say, you’ve prepared something in a microtiter plate and you’re just not sure where you left your liquid. The Cobot doesn’t forget. That’s one big step into the future. The other is everything I can do on my table, the Cobot can do as well.”
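
Heller’s two points – no low-level programming and a memory that never forgets where the liquid went – can be imagined, purely as an illustration and not as bAhead’s actual interface, as a declarative list of steps plus a persistent record of filled wells.

```python
# Hypothetical sketch: a cobot protocol as a declarative list of steps rather than
# low-level robot code, plus a record of filled wells so the robot "doesn't forget"
# where the liquid went. Illustrative only; the step names are invented.

protocol = [
    {"action": "aspirate", "volume_ul": 100, "source": "reservoir_A"},
    {"action": "dispense", "volume_ul": 50, "well": "A1"},
    {"action": "dispense", "volume_ul": 50, "well": "A2"},
]

filled_wells = set()  # persistent memory of where liquid has been placed

def run(steps):
    for step in steps:
        if step["action"] == "dispense":
            well = step["well"]
            if well in filled_wells:  # never double-fill a well by mistake
                continue
            filled_wells.add(well)
        print("executing:", step)

run(protocol)
print("liquid is in wells:", sorted(filled_wells))
```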

Przemyslaw Budnicki, CEO of Holo4Labs, explained the advantages of a technician being able to don a pair of VR-style glasses that project a virtual – or infinite – screen at eye level in front of them. They can activate the screen by touch, again virtually, and it will project back precise instructions, even guiding them through the lab. With the help of visual highlights it can direct them to the equipment they need, and can even indicate whether a machine requires recalibration, avoiding measurement errors – one of the main causes of incorrect results.

But unlike VR, everything appears merely as a hologram overlay. “There is a huge difference from virtual reality, where you don’t see anything around you,” he said. “These are like sunglasses. The technology understands the liquids, the QR codes, the samples, voice recognition. You can say next step, set value. You can dictate notes from your experiment and it’s in the system. You can have multiple windows in front of you. You don’t need to have a laptop on your workbench. We can screen the same information.”
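
The recalibration prompt Budnicki mentions boils down to a simple check a headset could surface; the sketch below is illustrative only (the instrument names, intervals and dates are invented), flagging any device whose last calibration is older than its allowed interval.

```python
from datetime import date, timedelta

# Hypothetical sketch of a recalibration check. Illustrative only - the
# instruments, intervals and dates below are invented.

instruments = {
    "balance_01":  {"last_calibrated": date(2021, 9, 1),   "interval_days": 90},
    "ph_meter_02": {"last_calibrated": date(2021, 11, 20), "interval_days": 30},
}

def needs_recalibration(info, today):
    return today - info["last_calibrated"] > timedelta(days=info["interval_days"])

for name, info in instruments.items():
    if needs_recalibration(info, today=date(2021, 12, 1)):
        print(f"highlight {name}: recalibration required")  # -> balance_01
```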

He insisted it wasn’t a case of changing the way we work, simply of making the things we already do easier. “It’s not a big change. It is when you talk about the technology – but not the way you work.” It all presents an image of advancement and efficiency: paperless environments, no laptops on the bench, remote access to machines doing repetitive tasks.

But what about the fear that everything will become intelligent, automated and autonomous, and that, with AI and pattern recognition redefining experiments, the scientists of the future will be left waiting outside while the cobots get on with the work? Budnicki says that goes back to why labs are slow to adopt.

“I’ve been an information technologist for more than 20 years and it was always a case of change management, how you change the way people work. It’s a slow process,” he said, adding that while this is the right direction, “according to our research, laboratory scientists spend 70 per cent of their time on paperwork and only 30 per cent on doing things. It’s crazy. Our mission is to change this ratio”. How long? A decade at least. But these are laboratories, after all…

So, will cobots and AI take our jobs?

No, says Hauke Heller. It’s all about efficiency. “People will certainly find ways of using the time that technology can spare them to do more things, to develop faster, to work in different ways.” Denis Ozdemir said: “These are all really nice examples of bringing human intelligence and Artificial Intelligence closer together. There will always be some things where AI will be better – taking huge amounts of data, going through it, analysing it – that is the strength of AI. But finding the patterns here and there, seeing the cross-connections – the creativity – this is the human [aspect], and if we can bring these together through the new interfaces, that would be really nice and that’s my ambition over the next years.”

Author

ACHEMA Inspire staff

World Show Media

www.worldshowmedia.net

Keywords in this article:

#smartlab
