Emerging Technology

Innovation that matters.

    Talking about Controlled English: A language for human-machine interaction

    For the last few years some of us in ETS have been doing client-funded research, and one of the things we've been investigating is whether there is a language that could make it easier for ordinary people and machines to interact with each other. Not a programming language (we've got enough of those already!), but something aimed at capturing and sharing knowledge and information in a human-friendly way rather than in a technical format like XML, JSON or a database. So we've been looking at Controlled Natural Languages and have developed our own, named "ITA Controlled English", or CE for short. We've been experimenting with it and showing proof-of-concept demonstrations to help understand what is possible and what might be most useful. We've also published a number of papers, recorded a few videos and released an experimental version of our development environment via developerWorks.
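
    To give a flavour of what CE looks like, here are a couple of illustrative statements (treat these as a sketch rather than definitive syntax; the papers and the developerWorks release define the language properly). The first defines a concept, the second asserts a fact about an instance of it:

        conceptualise a ~ person ~ P that has the value N as ~ full name ~.
        there is a person named 'p1' that has 'Dave Braines' as full name.

    The idea is that sentences like these are readable by people and at the same time directly processable by the machine.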

    image

    As well as getting lots of interest from our clients and internal contacts, we also entered and jointly won the UK round of the IBM internal TechConnect competition in 2014. Our entry, titled "Improving human-machine collaboration through language", told some of the CE story and we were delighted to see it come joint first with another entry from the ETS team in the area of information virtualisation. This gave us quite a lot of internal publicity, and recently I spoke to colleagues John McNamara and David Radley in a short IBM developer hangout session about our work in this area and where we see the potential.

    We're looking forward to more exciting developments with this research in 2015, and I'd like to thank my colleagues David Mott and John Ibbotson for their help with the TechConnect entry and with the Controlled English research in general.

    Dave Braines, 20 Mar 2015


    IBM ETS and Vodafone Hackday Event

    image

    The IBM Hursley Emerging Technology Centre and ETS team hosted Vodafone on Friday 12th Dec for a one day Hackday. The focus of the day was a hypothetical Car Pooling/Commuting application, designed to encourage employees to share vehicles on their way to work. The prototype applications were rapidly created using services available in IBM's Bluemix Cloud offering.

    image image

    The Vodafone and ETS participants were split into three teams and given the whole day (excluding lunch and coffee breaks!) to come up with their application. To make things interesting, the teams were given a variety of challenges to include in their applications, such as geo-fenced areas, league tables and leaderboards, 'greenness' scores rewarded with premium car parking spaces nearest to the office, and messaging between users of the service.
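
    As a flavour of the sort of logic behind the geo-fencing challenge, a simple circular geo-fence check can be sketched as follows (illustrative Python only; the teams assembled their versions from Bluemix services, and the coordinates and radius here are made up):

        # Illustrative geo-fence check: is a car-pool user inside a circular zone
        # around the office?  Centre and radius values below are hypothetical.
        from math import radians, sin, cos, asin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two points, in kilometres."""
            lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
            a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * 6371 * asin(sqrt(a))

        def inside_geofence(lat, lon, centre=(51.03, -1.40), radius_km=2.0):
            """True if the reported position is within radius_km of the fence centre."""
            return haversine_km(lat, lon, centre[0], centre[1]) <= radius_km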

    image image

    As the day drew to a close (and after a few last-minute panics!), all three teams successfully demonstrated their applications to their colleagues. Considering the short amount of time available and the limited brief provided, the range of services displayed was truly impressive – Node-RED, Twilio, MongoDB, Cloudant and what3words to name just a few.

    image image

    Niel de Kock, Principal IT Enterprise Architect for Vodafone Group said “What I really enjoyed about today was the experience of building a real, working application in a matter of hours. This really is a new paradigm and has helped stimulate our thoughts about how we can use it to enhance our offerings to our customers.”

    All agreed that the day was very worthwhile and had sparked many discussions about the possibilities that Bluemix provides. Four specific opportunities were identified and these will be the subject of several follow-up meetings between IBM and Vodafone in the New Year.

    Lee Fredricks, 15 Dec 2014


    Developer Eminence Lightning Talks in the Hursley Auditorium

    image

    ETS assisted in the organisation of a set of Developer Eminence Lightning Talks in the Hursley Auditorium earlier this month. Topics covered included writing blog posts, Google Hangouts, hackathons and how to create podcasts. To really drive home how easy the latter can be, John McNamara and Adrian Warman recorded a live, seat-of-the-pants edition of HursleyFM. To stimulate conversation and provoke questions from the audience, Adrian brought along his original Sinclair ZX80 (circa 1980) and stories of frozen milk cartons placed on the case to prevent overheating!

    Further Developer Eminence talks are planned for next year...

    image

    Lee Fredricks, 10 Dec 2014


    Emerging Technology Centre Innovation Day

    The Emerging Technology Centre (ETC) was a hive of activity earlier this month, playing host to back-to-back events in support of the UK company’s strategic initiatives – Cloud, Analytics, Mobile, Social & Security (CAMSS), and more.

    At the first event, colleagues from across the business witnessed a series of new and innovative demonstrations taken from client engagements this year, with a view to bringing their own clients along to see how IBM can help solve their technology problems. And at the second event, several analysts were invited to see demonstrations and hear talks from experts on IBM solutions. Both events received overwhelmingly positive feedback, with one analyst describing the experience as ‘awesome’.

    ‘Potential differentiator’

    Explaining the background to the events, Lee Fredricks, Emerging Technology Client Services Manager, says: “In his 2Q Performance Update, David Stokes, General Manager, IBM UK & Ireland, cited the opening of the Emerging Technology Centre as one of several key investments made earlier this year that will help enable IBM UK to accelerate growth in our strategic initiatives, describing the centre as ‘a forum to innovate with our clients around their most challenging business issues’.

    “He stated that it was now up to us to capture the opportunity that is in front of us – and that was what we set out to achieve through holding these events: to get across to our colleagues the diversity of areas that Emerging Technology Services (ETS) can assist them with, as they focus on 4Q.”

    David is, indeed, a great supporter of what the Emerging Technology Centre – and the lab in general – can bring to bear in client engagements. In the comments section of his blog, he writes: “I learnt the value of the Hursley back in my days leading the WebSphere team in the UK – we have so much talent… and we must engage our colleagues there in opportunities. Hursley is a potential differentiator for IBM in the UK in many sales opportunities."

    image

    Demos, Lightning Talks and guest speakers

    Describing the first event in more detail, Lee says: “We invited our colleagues from Global Business Services (GBS), Global Technology Services (GTS), Strategic Outsourcing (SO), Chief Technology Office (CTO) and Sales & Distribution (S&D) to come and view a raft of new demonstrations taken from our client engagements in 2014. These varied from Buildings Infrastructure Management controlled by an Oculus Rift, to the handling of sensitive information over the web using Fully Homomorphic Encryption, to monitoring distributed stock levels using Gaian data federation!

    “In addition, we ran a programme of Lightning Talks throughout the event, which were well attended. We heard how ETS developers have been involved in the fight against cancer; about new open source assets such as Node-RED and Edgware, which were developed by ETS and are now 'out there' with thriving communities; about the state-of-the-art in Bluemix and Watson; about recent developments with the SyNAPSE chip and neuromorphic technology; and much more besides.” (A full list of the Lightning Talks given is provided below.)

    Lee continues: “We were also lucky enough to have four esteemed guest speakers who recounted their experiences of working with ETS. Trevor Davis, Global Subject Matter Expert (SME) in Consumer Products, and a Distinguished Engineer, described how ETS assisted in the task of boot-strapping manufacturing facilities with sensors and instrumentation, bringing them into the world of shared data and the Internet of Things (IoT). Sam Seddon, Wimbledon Client Executive, described the most recent Wimbledon Social Command Centre and the challenges of Social computing. Tony Morgan, Client Chief Innovation Officer, Strategic Outsourcing, described a wealth of projects for which he has engaged ETS, including Innovation Days, dashboards for War Rooms and gamification. And Richard Smith, Chief Architect, Operational Information Systems, described the close relationship between ETS and the International Technology Alliance (ITA), a consortium involving many eminent Research, Academic and Industrial partners.

    “The feedback from the event was excellent, with praise for the speakers and demonstrations; many expressed the view that they had not appreciated the range of projects and technologies ETS and IBM are involved in. Many more have promised to bring their clients into the Emerging Technology Centre so that ETS can show them how we can solve their technology problems. We look forward to welcoming them very soon!”

    ‘Awesome’

    On the second day, the Centre welcomed over a dozen analysts from companies such as Gartner, Forrester, IDC, Freeform Dynamics, Lustratus and Beecham Research.

    “The analysts made their way around pedestals focused on Cognitive Computing, Cloud Computing, Big Data & Analytics, Mobile Computing, Social Business, Smarter Planet and Security, listening to our SMEs describe IBM's point of view and the recent projects that the Emerging Technology team have delivered in these areas,” says Lee.

    “Once again, the feedback was excellent, with one analyst overheard saying there's only one word I can use to describe what I've seen, and that is ‘awesome’!”

    Lightning Talks

    • Innovation Thinking (Kevin Turner): climate and culture for innovative thinking.
    • Integrating the physical world (Christopher Gibson): Edgware Fabric is a new open source technology developed by ETS that integrates people and devices at the very edge of the network, bringing the benefits of middleware to the Internet of Things.
    • The Bluemix Garage - IBM as a Start-up (Andy Bravery): with Bluemix, IBM now has an offering that is relevant and accessible to tech start-up companies and even individual developers. The challenge is how to sell to this market segment, which is unlike any of our traditional customer sets. Bluemix Garage is about meeting these new potential clients on their own turf and encouraging them to make Bluemix the platform of choice for their technology-led businesses as they position themselves for a high-growth future.
    • Security - What matters most? (Saritha Arunkumar): ETS works on various interesting and unique aspects of security, the things that matter most. Come and hear about location security, geo-spatial access control, biometrics and modern cryptography.
    • Node-RED: a year in the life of an open-source project (Nick O'Leary).
    • Tackling Cancer with Machine Learning (Graham White): recent ETS involvement with Cancer Research UK.
    • Watson Update (Andy Naylor): ETS involvement in Watson development.
    • Transatlantic research with the ITA (John Ibbotson): innovation from ETS as leader of the International Technology Alliance research programme.
    • Gaian: And you thought you had seen it all in the information management space? (Patrick Dantressangle): introduction to the ETS Gaian asset and the part it can play in information federation.

     

    Please contact ets@uk.ibm.com for more details on any of these talks.


    Automatic detection of anomalies in sensor data from buildings

    Clients who maintain buildings spend many months building rules to create alarms when something goes wrong. However:

    1. It’s time-consuming and tedious to manually enter all the rules.
    2. Even once a large number of rules have been established, the system is still rather fragile and produces lots of false alarms.  This is expensive because it wastes the engineers’ time.
    3. The system cannot adapt over time (e.g. to a new employee who likes to keep their office very cold).

    The objective of this piece of work is to use IBM’s “big data” tools on data from both inside and outside the building to learn the conditions that require an engineer’s attention without writing rules, making the system simpler and much cheaper to set up when constructing a new building or extending an existing one.  Instead of requiring rules to be manually entered, it would sift through all the historical sensor data from the building to learn the dominant patterns and relationships.  Once a model has been learnt, the system should produce an ‘anomaly’ score for any new data, and could also update its model over time.  If a suitable model is used (a ‘predictive’ model as opposed to a ‘discriminative’ model) then the system could also make predictions.  Our plan is to build a prototype of this system, whilst also recording a large, home-grown dataset to show off IBM’s big data tools.

    Data collection

    The Hursley building management system has over 15,000 objects.  These are connected using a protocol called BACnet (Building Automation and Control NETworks) over IP.  This BACnet IP network is physically separate from IBM’s ‘9’ network.  We have written an application which continually polls all 15,000 objects on the network to request their present value.  It takes a little under an hour to poll all 15,000 objects once.  We store the data locally on the logging machine.  Every midnight, the logging machine disconnects from the BACnet, connects to the IBM ‘9’ network and squirts the last day’s data to the ETS instance of BigInsights and then reconnects to the BACnet to continue data collection.
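
    As a rough sketch of the shape of that logger (not the production code), the polling loop looks something like the following; read_present_value() is a hypothetical stand-in for a BACnet/IP ReadProperty request made through whichever BACnet client library is in use:

        # Simplified sketch of the data-collection loop.  read_present_value() is a
        # hypothetical helper wrapping a BACnet/IP ReadProperty request for the
        # 'present value' property of one object.
        import csv
        import time
        from datetime import datetime

        def poll_once(objects, read_present_value, out_path):
            """Poll every BACnet object once and append the readings to a local CSV file."""
            with open(out_path, "a", newline="") as f:
                writer = csv.writer(f)
                for device_id, object_id in objects:
                    try:
                        value = read_present_value(device_id, object_id)
                    except IOError:
                        value = None  # logged as missing data
                    writer.writerow([datetime.utcnow().isoformat(), device_id, object_id, value])

        def run_logger(objects, read_present_value, out_path):
            """Poll continuously; one sweep of ~15,000 objects takes a little under an hour."""
            while True:
                started = time.time()
                poll_once(objects, read_present_value, out_path)
                print("sweep completed in %.0f seconds" % (time.time() - started))
                # each midnight a separate job pushes the day's file to BigInsights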

    Spotting patterns

    Before we can build a statistical model of the data, we need to visualise it to get a feel for what’s going on.

    The plot below shows three weeks of data for about 100 BACnet objects.  The x-axis represents time, with ticks and grid lines positioned daily at midnight.  Each row represents a sensor (i.e. each tick on the y-axis is a single sensor).  The sensors have been ordered by how well they correlate, so sensors close together behave similarly.  The output for each sensor has been linearly mapped to the range 0 to 1.  Red indicates missing data.

    Some objects show daily and weekly patterns (for example the ‘cooling setpoints’ and the ‘internal room temperatures’ marked on the plot below).  Some objects do not appear to follow any obvious pattern.  Some objects return discrete values (e.g. ‘active’ or ‘inactive’) whilst others report continuous values.

    image
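
    For the curious, a heat map like the one above can be produced along these lines (a minimal pandas/matplotlib sketch; the file name and layout are hypothetical, and the real data sits in BigInsights rather than a local CSV):

        # Sketch: correlation-ordered heat map of normalised sensor values.
        import numpy as np
        import pandas as pd
        import matplotlib.pyplot as plt
        from scipy.cluster.hierarchy import linkage, leaves_list
        from scipy.spatial.distance import squareform

        # One column per BACnet object, one row per polling sweep (hypothetical file).
        df = pd.read_csv("bacnet_sample.csv", index_col="timestamp", parse_dates=True)

        # Linearly map each sensor to the range 0 to 1 (constant sensors become NaN).
        rng = (df.max() - df.min()).replace(0, np.nan)
        norm = (df - df.min()) / rng

        # Order sensors so that well-correlated ones sit next to each other.
        dist = 1 - norm.corr().fillna(0).values
        order = leaves_list(linkage(squareform(dist, checks=False), method="average"))
        norm = norm.iloc[:, order]

        # Missing data is shown in red via the colormap's 'bad' colour.
        cmap = plt.cm.viridis.copy()
        cmap.set_bad("red")
        plt.imshow(np.ma.masked_invalid(norm.T.values), aspect="auto", cmap=cmap)
        plt.xlabel("time")
        plt.ylabel("sensor (correlation-ordered)")
        plt.show()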

    Modelling approaches

    The next phase of the project will be to build statistical models for the data.  The first approach will be to use fairly simple statistical techniques (probably little more complex than is taught on A-Level stats modules).  For continuous-valued objects which follow a regular daily pattern, we will learn a simple normal distribution for each hour of the day.  For continuous-valued objects which do not follow a daily pattern, we will just use regression.  For discrete-valued objects, we will try using Markov chains conditioned on the hour of the day.  Once this is done, we will look at simple ways to model correlations between objects.
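
    For the first of those cases, the intent is roughly the following (a minimal sketch with an illustrative threshold; the eventual implementation may differ):

        # Sketch: one normal distribution per hour of the day for a single
        # continuous-valued object; the anomaly score is the absolute z-score
        # of a new reading against that hour's distribution.
        import numpy as np

        def fit_hourly_model(timestamps, values):
            """timestamps: sequence of datetimes; values: matching sensor readings."""
            hours = np.array([t.hour for t in timestamps])
            values = np.asarray(values, dtype=float)
            model = {}
            for h in range(24):
                v = values[hours == h]
                model[h] = (v.mean(), v.std() + 1e-6)  # guard against zero variance
            return model

        def anomaly_score(model, timestamp, value):
            mean, std = model[timestamp.hour]
            return abs(value - mean) / std  # e.g. flag for attention when the score exceeds 3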

    In parallel with this statistical modelling, we will build a pretty user interface to show to clients.

    If this all works, and if there’s time left, then we might try some sexier statistical techniques like recurrent neural networks (especially ‘long short-term memory’ (LSTM), which excels at modelling time series data).  These models are computationally expensive to train, so we might need to train on a fast GPU.  Or, further into the future, maybe the code could be re-implemented on IBM’s new ‘neurosynaptic’ chips produced under the SyNAPSE project.

    Jack (aka Daniel) Kelly, 16 Sep 2014


    Oculus Rift Hack Day (and beyond)

    For the ETS hack day, the team of Peter J, Markus, Hamish, Yi and me experimented with the Oculus Rift (OR), which we had been lent.

    We started by deciding on the goal for the project: to create an interactive virtual environment that assists in controlling the real environment (including the display of real-world data in the virtual world).

    The aim would be to help:

    • Trainee engineers, who could learn how to maintain buildings to some extent without having to be in the building
    • Emergency services, as a rescue planning tool
    • Engineers, who could peel back the building to see the locations of pipes and other services
    • ETS, by gaining experience in Virtual Reality, Augmented Reality and 3D models

    There were five activities that needed to be done:

    • Get the 3D model showing in the OR
    • Swap the socket-based interface to the client centre for an MQTT one
    • Publish MQTT messages from within the virtual environment
    • Allow highlighting of objects from within the model
    • Create a control menu within the virtual environment

    By the end of the first day, the 3D model was working reasonably well in the OR, and the MQTT interface to the lab was ready to test. MQTT messages could be published from within the environment. There was a lot more work to be done to embed HTML within the virtual environment so that it did not move around with the world.
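
    Publishing from the virtual environment to the lab boils down to something like this (a minimal sketch using the Eclipse Paho Python client, 1.x API; the broker address and topic are illustrative, not the real lab configuration):

        # Sketch: publish a control message when an object is selected in the
        # virtual environment.  Broker host and topic name are made up.
        import json
        import paho.mqtt.client as mqtt

        client = mqtt.Client()
        client.connect("etslab.example.com", 1883)

        def on_object_selected(object_name, action):
            """Called by the virtual environment when the user activates an object."""
            payload = json.dumps({"object": object_name, "action": action})
            client.publish("etslab/building/control", payload)

        on_object_selected("meeting-room-light-3", "toggle")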

    By the end of the second day, the five activities were complete – but did not all string together seamlessly.

    Markus and Yi (and a bit of Peter) have since spent a couple of days ironing out the bugs and improving the performance – and this led to its first customer demonstration to a large energy company, in the context of Building Information Modelling, on Wednesday. A nice demonstration is now packaged with a start-up script and available for lab tours in the ETS lab.

    Kevin Brown, 12 Jun 2014


    Who are ET?

    Based at IBM's Hursley Labs in the UK, Emerging Technology are a small and highly skilled team with experience across a wide range of industries and technologies.

    Learn more