Emerging Technology

Innovation that matters.

    Physical Web

    QR Codes to Bluetooth

    We've had QR codes in our lab for years, though I don't think they've ever been a significant part of a real project (feel free to shout me down, ETS colleagues). The technology behind them is solid, but their usability problems are well documented, when they're not simply the butt of a joke. Despite being derided, they do have industrial applications, are looked upon more fondly in China and Japan, and even Western consumers use them more than they think, though that use tends to be in generating the codes rather than scanning them (paying with the Starbucks app, printing airline boarding passes). Nonetheless, there is still an awkwardness in using them, and their ugliness ruins many slickly designed posters.

    I was interested to hear talks by Scott Jenson and Stephanie Rieger at November's excellent Beyond Tellerrand; both discussed Google's Physical Web. The open-source project replaces QR codes with Bluetooth beacons that transmit URLs to nearby smartphones (no awkward scanning required). The idea is that it makes more sense to control Internet of Things (IoT) devices via the web, rather than installing an app for every connected device. Broadcasting a URL provides the link between the physical device and its web interface.

    Google's implementation (via Chrome or a Physical Web app on iOS) displays metadata for each URL, currently a title and a description, when the beacon's signal is received. Stephanie talked about how nearby beacons might be integrated into web search results, and Scott described examples where the metadata is changed dynamically to provide real-time information for the beacon. This is a neat trick. The beacon continues to transmit the same fixed URL, but the metadata at that URL is changing, giving the appearance that the beacon is emitting live data. Scott gave an example of a beacon transmitting bus tracking information using this technique.
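
    To make the trick concrete, here's a minimal sketch of the kind of page such a beacon might point at. It isn't Scott's bus example or Google's code; it's just an ordinary Node.js server at a fixed URL whose title and description are regenerated on every request (the stop number and bus times are invented). The beacon itself never changes what it broadcasts; only the metadata behind the URL does.

        // dynamic-metadata.js - illustrative sketch only
        const http = require('http');

        // Placeholder for a real transit lookup; returns a made-up arrival time.
        function nextBusMinutes() {
          return 1 + Math.floor(Math.random() * 15);
        }

        http.createServer((req, res) => {
          const mins = nextBusMinutes();
          res.writeHead(200, { 'Content-Type': 'text/html' });
          // The beacon always broadcasts this page's URL; only the metadata changes.
          res.end(`<!doctype html><html><head>
            <title>Bus stop 42: next bus in ${mins} min</title>
            <meta name="description" content="Live departures, updated ${new Date().toISOString()}">
          </head><body><p>Next bus in ${mins} minutes.</p></body></html>`);
        }).listen(8080);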

    The ETS group in the UK are already working on IoT projects and ideas: Node-RED, Conversational IoT and Smarter Buildings. The Physical Web appealed to me because it is simple: the software is web based, and all the hardware can do is broadcast a URL. Counterintuitively, having such a tightly specified technology provides space, and a challenge, to be creative in its use. IoT feels quite messy at the moment: there's lots happening, but many different technologies and approaches are competing for critical mass, and I find it difficult to explain what it is. The Physical Web is trying to address one tiny part of the space, and it makes sense in its own right.

    Kevin is working on using IoT to enhance the physical workspace. I'm interested in why our internal meetings are terrible; actually, I'm less interested, more frustrated. We tend towards group decision procrastination. As clichéd as they may be, post-it notes and a bias towards writing instead of talking have helped. In online meetings we have tools to support decision making, specifically voting, but this isn't something we use in physical meetings. It could easily be done, but isn't. As an experiment in using the Physical Web, I wanted to create a voting system for physical meetings. A meeting would have a current question and attendees could vote with one click. There would be no entering URLs, downloading apps, or scanning QR codes.

    Prototyping

    I considered using a simple Bluetooth beacon, something like an Estimote, with a fixed URL to represent the current meeting question. The question would be displayed on the user's phone; they'd click on it and then select the answer. It's not terrible, but there is friction. Ideally I wanted all the possible answers to be visible directly. I could have used multiple beacons, one for each answer, but you don't necessarily know how many options (and therefore beacons) a question needs. Also, we didn't have many beacons. Instead, I used a single Bluetooth device connected to a computer so that I could programmatically update the URL it was transmitting. There's only enough space to transmit one URL at a time, so I made the code cycle through an array of URLs - one for each answer - transmitting a URL for a few seconds before moving on to the URL for the next option. Smartphones show the beacons even if they've not transmitted for a few seconds, so the result is a list of question answers on the user's device, using just one physical Bluetooth device. It may take a few seconds for all the answers to become visible, but in practice that didn't seem to be a problem.
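
    The cycling itself is simple enough to sketch. The snippet below is not the prototype's actual code: advertiseUrl() stands in for whatever call the Bluetooth beacon library provides (here it only logs), and the answer URLs are hypothetical. The point is just the rotation: one physical device, an array of URLs, a few seconds each.

        // cycle-beacon.js - sketch of rotating one beacon through several answer URLs
        // advertiseUrl() is a stand-in for a real Eddystone-URL/beacon library call.
        function advertiseUrl(url) {
          console.log('now advertising', url);
        }

        // One short URL per possible answer to the current meeting question.
        const answerUrls = [
          'https://example.com/q7/yes',
          'https://example.com/q7/no',
          'https://example.com/q7/abstain'
        ];

        const DWELL_MS = 4000; // broadcast each URL for a few seconds
        let i = 0;

        advertiseUrl(answerUrls[i]); // start with the first answer straight away
        setInterval(() => {
          i = (i + 1) % answerUrls.length; // move on to the next answer's URL
          advertiseUrl(answerUrls[i]);
        }, DWELL_MS);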

    I don't think beacons and the Physical Web were really meant to work like this. As far as I understand I'm not breaking any protocol rules, but I suspect rapidly cycling through broadcast URLs is against the spirit of beacons, if not the Physical Web. While this currently works, it could break in a future Physical Web update; it's up to the Physical Web clients to decide what they make of the Bluetooth signals they receive.

    The Physical Web relies on a proxy server to supply the metadata to the smartphone. This is done for speed, but also for security: the URL needs to be accessed without the user's intervention, so fetching an untrusted and unknown URL directly would have security implications. Scott talked about providing dynamic updates of metadata by setting the cache headers on the URL's page, but this isn't quite possible at the moment: Google's proxy servers cache the metadata with a minimum refresh time of 5 minutes, regardless of the headers. There's nothing to stop you using your own proxy, but that would need the user to install an app, which defeats the main advantage of the Physical Web. The result is that a new question's metadata might not be visible for 5 minutes, which isn't particularly usable for meeting votes. To get around this I generate a new URL for each question that's asked.
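
    A rough sketch of that workaround, again hypothetical rather than the prototype itself: each new question mints a fresh set of answer URLs (paths the proxy has never cached), and the same server serves the title metadata and counts a hit as a vote. In a real system you would also need to tell the proxy's own metadata fetch apart from a genuine tap, which this sketch ignores.

        // votes.js - sketch of minting new URLs per question to dodge the 5-minute cache
        const http = require('http');
        const crypto = require('crypto');

        let current = null; // the question being voted on right now

        function askQuestion(text, answers) {
          const id = crypto.randomBytes(4).toString('hex'); // fresh id => URLs the proxy hasn't cached
          current = { id, text, answers, counts: answers.map(() => 0) };
          // These paths are what the beacon cycles through (see the earlier sketch).
          return answers.map((_, n) => `/q/${id}/${n}`);
        }

        http.createServer((req, res) => {
          const m = req.url.match(/^\/q\/([0-9a-f]+)\/(\d+)$/);
          if (!m || !current || m[1] !== current.id) { res.writeHead(404); return res.end(); }
          const n = Number(m[2]);
          current.counts[n] += 1; // naive: every request counts as one vote
          res.writeHead(200, { 'Content-Type': 'text/html' });
          res.end(`<title>${current.text} - ${current.answers[n]}</title>` +
                  `<p>Vote recorded for "${current.answers[n]}".</p>`);
        }).listen(8080);

        console.log(askQuestion('Ship the prototype this week?', ['Yes', 'No', 'Abstain']));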

    Next

    The prototype is running in our lab. The next step is to wait; this will only become useful if the Physical Web takes off, and for that to happen support will need to be built into both Android and iOS. It's been useful to explore the limits of what you can do with the Physical Web and how it might be used. It's work you can only do by experimenting and pushing at the edges, something that I think the Emerging Technology group is good at doing. Whether voting in meetings is useful is moot; it's easier to explore a technology using a real problem.

    In this project there's a question over what the physical 'thing' is that I'm attaching the URL to. Usually the thing would be a device, be it a printer, parking meter or fridge. In this case I'm not sure if the physical thing is the meeting room, or the meeting itself. How would someone know that a meeting has a Physical Web interface? If you have to put a sign up saying “Scan for this meeting's URLs”, it feels like you're not far from having a QR code again, or just a URL in English. The interface to the meeting that I've created is invisible. Is an invisible UI a good or bad thing? It's clearly a problem in this case and more thought is needed, but again it's useful to find real problems or questions by building something tangible.

    Darren Shaw, January 2016, IBM Emerging Technology


    BBC Make It Digital – Showing off our research

    Over the August bank holiday the BBC Make It Digital tour visited Cardiff as part of the Harbour Festival.  This tour is all about digital creativity and aims to inform and inspire people around the country, children and adults alike.  I was delighted to hear that Cardiff University would be showcasing “SHERLOCK”, a new kind of digital assistant that is based on some of our collaborative research with the University over the past few years under the ongoing International Technology Alliance (ITA) research programme.

    Our main collaborator, Professor Alun Preece from Cardiff University, took the stage on Sunday and Monday to introduce the SHERLOCK assistant that they have been building and to explain how human-machine conversation could help in everyday life and also in emergency or disaster situations.  In our joint research we are investigating ways to create a seamless and intuitive environment in which normal, untrained people can interact with a system: asking questions or providing information.  Think of it as an attempt to “harness the power of people”, but in a richer, more powerful and more flexible way than traditional crowd-sourcing.  We propose that it is possible to treat groups of willing participants as a kind of human sensing system if these sorts of capabilities are readily available and easy to use.

    To help demonstrate the idea, Alun and his colleagues had designed a crowd-sourcing game in which the audience had to provide answers to a variety of “Dr Who” and “Sherlock” TV series questions.  The inspiration for these topics is that Cardiff is a regular filming location for these and other BBC TV series.  This game is a variation on some of our earlier research experiments, in which we enlisted the help of students at the University to collaboratively answer a series of “whodunit” style questions in a controlled environment, and on other work where we analysed Twitter conversations during the NATO summit in Newport in 2014.

    For this event Alun and his colleagues had developed a new mobile interface to allow text-based conversations, and the audience accessed it via their mobile phones and tablets to start collaboratively gathering the answers that were required.  All of this was built using CENode.js – a JavaScript framework developed at the University for processing “Controlled English”, which is a key piece of the collaborative research that we have been undertaking with them.

    I joined in the exercise and found that my Dr Who knowledge was far poorer than I imagined, but that was OK because I took the opportunity to visit the excellent Dr Who Experience after the event and managed to fill in those gaps in my knowledge!

    The experiment worked really well on both days and the audience managed to contribute a lot of information into the environment.  The folks at Cardiff University are analysing the data now and I’m looking forward to carrying on with our collaborative research until May 2016 to see where else we can apply this interesting approach towards making agile human-machine teams a reality.

    If you are interested in more details about the research we’ve been doing you can see a variety of our publications here and here.

    Dave Braines, 14 Sep 2015


    The impact of cognitive computing on computing education

    STEMtech is an annual conference focusing on advancing education in science, technology, engineering and maths, bringing together industry, educators and government policy makers. 
     
    At this year's STEMtech, Dale Lane from IBM's ETS gave a talk explaining Cognitive Computing, then introduced and led a lively discussion on how it will change the way that children are taught about computers.
     
    A copy of the slides that were used to introduce the discussion can be found here.
     
    Anna Thomas, 18 May 2015

    Talking about Controlled English: A language for human-machine interaction

    For the last few years some of us on ETS have been doing client-funded research, and one of the things we've been investigating is whether there is a language that could be used to make it easier for normal people and machines to interact with each other. Not a programming language (we've got enough of those already!), but something aimed at capturing and sharing knowledge and information in a human-friendly way rather than in a technical format like XML, JSON or a database. So we've been looking at Controlled Natural Languages and have developed our own, named "ITA Controlled English", or CE for short, and have been experimenting with it and showing proof-of-concept demonstrations to help understand what is possible and what might be most useful. We've also published a bunch of papers, recorded a few videos and released an experimental version of our development environment via developerWorks.
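
    For a flavour of what CE looks like (these lines are only indicative; the precise ITA CE grammar is defined in the published papers), a concept and a fact might be written like this:

        conceptualise a ~ vehicle ~ V that has the value C as ~ colour ~.
        there is a vehicle named 'car 1' that has 'red' as colour.

    The idea is that sentences like these are readable by people yet constrained enough for a machine to parse and reason over, which is what makes them useful as a shared human-machine language.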


    As well as getting lots of interest from our clients and internal contacts, we also entered and jointly won the IBM internal TechConnect competition for the UK in 2014. Our entry, titled "Improving human-machine collaboration through language", told some of the CE story, and we were delighted to see it come joint first with another entry from the ETS team in the area of information virtualisation. This gave us quite a lot of internal publicity, and recently I spoke to colleagues John McNamara and David Radley in a short IBM developer hangout session about our work in this area and where we see the potential.

    We're looking forward to more exciting developments with this research in 2015, and I'd like to thank my colleagues David Mott and John Ibbotson for their help with the TechConnect entry and with the Controlled English research in general.

    Dave Braines, 20 Mar 2015


    IBM ETS and Vodafone Hackday Event


    The IBM Hursley Emerging Technology Centre and ETS team hosted Vodafone on Friday 12th December for a one-day Hackday. The focus of the day was a hypothetical car pooling/commuting application, designed to encourage employees to share vehicles on their way to work. The prototype applications were rapidly created using services available in IBM's Bluemix cloud offering.


    The Vodafone and ETS participants were split into three teams and given the whole day (excluding lunch and coffee breaks!) to come up with their application. To make things interesting, the teams were given a variety of challenges to include in their applications, such as geo-fenced areas, league tables and leader boards, 'greenness' scores rewarded with premium car parking spaces nearest to the office, and messaging between users of the service.


    As the day drew to a close (and after a few last-minute panics!), all three teams successfully demonstrated their applications to their colleagues. Considering the short amount of time available and the limited brief provided, the range of services on display was truly impressive – Node-RED, Twilio, MongoDB, Cloudant and what3words, to name just a few.

    Niel de Kock, Principal IT Enterprise Architect for Vodafone Group said “What I really enjoyed about today was the experience of building a real, working application in a matter of hours. This really is a new paradigm and has helped stimulate our thoughts about how we can use it to enhance our offerings to our customers.”

    All agreed that the day was very worthwhile and had sparked many discussions about the possibilities that Bluemix provides. Four specific opportunities were identified and these will be the subject of several follow-up meetings between IBM and Vodafone in the New Year.

    Lee Fredricks, 15 Dec 2014


    Developer Eminence Lightning Talks in the Hursley Auditorium

    ETS assisted in the organisation of a set of Developer Eminence Lightning Talks in the Hursley Auditorium earlier this month. Topics covered included writing blog posts, Google Hangouts, hackathons and how to create podcasts. To really drive home how easy the latter can be, Jon McNamara and Adrian Warman recorded a live, seat-of-the-pants edition of HursleyFM. To stimulate conversation and provoke questions from the audience, Adrian brought along his original Sinclair ZX80 (circa 1980) and stories of frozen milk cartons placed on the case to prevent overheating!

    Further Developer Eminence talks are planned for next year...


    Lee Fredricks, 10 Dec 2014


    Who are ET?

    Based at IBM's Hursley Labs in the UK, Emerging Technology are a small and highly skilled team with experience across a wide range of industries and technologies.

    Learn more