A System-Level Brain Model for Enactive Haptic Perception in a Humanoid Robot

September 22, 2023

Ingvarsdóttir et al. presented a new model of haptic processing using the robot Epi at ICANN 2023. They describe how perception is not a passive process but the result of an interaction between an organism and the environment. This is especially clear in haptic perception, which depends entirely on tactile exploration of an object. The authors investigate this idea in a system-level brain model of somatosensory and motor cortex and show how it can use signals from a humanoid robot to categorize different objects. The model suggests a number of critical properties that the sensorimotor system must have to support this form of enactive perception. Furthermore, they show that motor feedback during controlled movements is sufficient for haptic object categorization.



Ingvarsdóttir, K. Ó., Johansson, B., Tjøstheim, T. A., & Balkenius, C. (2023, September). A System-Level Brain Model for Enactive Haptic Perception in a Humanoid Robot. In International Conference on Artificial Neural Networks (pp. 432-443). Cham: Springer Nature Switzerland.


The Missing Link Between Memory and Reinforcement Learning

December 10, 2020

In a new publication, Balkenius, Tjøstheim, Johansson, Wallin and Gärdenfors extend their earlier model of memory processing with a decision making mechanism. They describe how this memory mechanism can support decision making when the alternatives cannot be evaluated based on immediate sensory information alone. Instead we first imagine, and then evaluate, a possible future that will result from choosing one of the alternatives. The model accumulates evidence over time, whether that information comes from the sequential attention to different sensory properties or from internal simulation of the consequences of making a particular choice. The authors show how the new model explains simple immediate choices, choices that depend on multiple sensory factors, and complicated selections between alternatives that require forward-looking simulations based on episodic and semantic memory structures. In this framework, vicarious trial and error is explained as an internal simulation that accumulates evidence for a particular choice. It is argued that a system like this forms the "missing link" between more traditional ideas of semantic and episodic memory, and the associative nature of reinforcement learning.
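The evidence-accumulation idea described above can be illustrated with a minimal race-model sketch (a hypothetical illustration in plain Python, not the authors' implementation): each alternative integrates noisy support over time, and the first accumulator to reach a threshold determines both the choice and the decision time.

```python
import random

def accumulate_choice(drift_rates, threshold=1.0, noise=0.1, max_steps=10000):
    """Illustrative race model: each alternative integrates noisy evidence
    until one accumulator reaches the threshold."""
    evidence = [0.0] * len(drift_rates)
    for step in range(max_steps):
        for i, drift in enumerate(drift_rates):
            evidence[i] += drift + random.gauss(0.0, noise)
            if evidence[i] >= threshold:
                return i, step  # chosen alternative and decision time
    return None, max_steps

# Alternative 0 receives stronger support and should usually win the race.
choice, rt = accumulate_choice([0.05, 0.01])
```

An alternative with a higher drift rate is chosen more often and faster, which mirrors how stronger accumulated evidence, whether sensory or internally simulated, dominates the decision.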



Balkenius, C., Tjøstheim, T. A., Johansson, B., Wallin, A., & Gärdenfors, P. (2020). The Missing Link Between Memory and Reinforcement Learning. Frontiers in Psychology, 11, 3446.


A Computational Model of Trust-, Pupil-, and Motivation Dynamics

September 20, 2019

In a new publication, Tjøstheim, Johansson and Balkenius argue that machines may benefit from being able to explicitly build or withdraw trust with specific humans. The latter is relevant in situations where the integrity of an autonomous system is compromised, or if humans display untrustworthy behaviour towards the system. Examples of systems that could benefit might be delivery robots, maintenance robots, or autonomous taxis. This work contributes by presenting a biologically plausible model of unconditional trust dynamics, which simulates trust building based on familiarity, but which can be modulated by painful and gentle touch. The model displays interactive behaviour by being able to realistically control pupil dynamics, as well as determine approach and avoidance motivation.



Tjøstheim, T. A., Johansson, B., & Balkenius, C. (2019). A Computational Model of Trust-, Pupil-, and Motivation Dynamics. In HAI 2019. ACM.


Cumulative inhibition in neural networks

November 03, 2018

In a new publication, Tjøstheim and Balkenius show how a multi-resolution network can model the development of acuity and coarse-to-fine processing in the mammalian visual cortex. The network adapts to input statistics in an unsupervised manner, and learns a coarse-to-fine representation by using cumulative inhibition of nodes within a network layer. They show that a system of such layers can represent input by hierarchically composing larger parts from smaller components. It can also model aspects of top-down processes, such as image regeneration.



Tjøstheim, T. A. & Balkenius, C. (2018). Cumulative inhibition in neural networks, Cognitive Processing, 1-16.


Memory Model Published

April 11, 2018, 18.56

A model of memory processes was published in Frontiers in Robotics and AI. Balkenius and coworkers introduce a memory model for robots that can account for many aspects of an inner world, ranging from object permanence, episodic memory, and planning to imagination and reveries. It is modeled after neurophysiological data and includes parts of the cerebral cortex together with models of arousal systems that are relevant for consciousness. The three central components are an identification network, a localization network, and a working memory network. Attention serves as the interface between the inner and the external world. It directs the flow of information from sensory organs to memory, as well as controlling top-down influences on perception. It also compares external sensations to internal top-down expectations. The model is tested in a number of computer simulations that illustrate how it can operate as a component in various cognitive tasks including perception, the A-not-B test, delayed matching to sample, episodic recall, and vicarious trial and error.



Balkenius, C., Tjøstheim, T. A., Johansson, B. & Gärdenfors, P. (2018). From focused thought to reveries: A memory system for a conscious robot. Frontiers in Robotics and AI. doi:10.3389/frobt.2018.00029


Model of Pupil Dilation

January 17, 2017

A new computational model using Ikaros was published in Connection Science. Birger Johansson and Christian Balkenius present a system-level connectionist model of pupil control that includes brain regions believed to influence the size of the pupil. It includes parts of the sympathetic and parasympathetic nervous systems together with the hypothalamus, amygdala, locus coeruleus, and cerebellum. Computer simulations show that the model is able to reproduce a number of important aspects of how the pupil reacts to different stimuli: (1) It reproduces the characteristic shape and latency of the light reflex. (2) It elicits pupil dilation as a response to novel stimuli. (3) It produces pupil dilation when shown emotionally charged stimuli, and can be trained to respond to initially neutral stimuli through classical conditioning. (4) The model can learn to expect light changes for particular stimuli, such as images of the sun, and produces a "light response" to such stimuli even when there is no change in light intensity. (5) It also reproduces the fear-inhibited light reflex, where the reaction to a light increase is weaker after presentation of a conditioned stimulus that predicts punishment.



Johansson, B. and Balkenius, C. (2017). A Computational Model of Pupil Dilation. Connection Science.


Exposition at LUX

January 20, 2017

A number of robots controlled by Ikaros were demonstrated at a public exposition at LUX, January 18-29, 2017.

Students at Lund University Cognitive Science demonstrated different implementations on the robot Epi. One group used Ikaros to allow Epi to pick up an object and throw it at visitors. Another had implemented different emotional expressions. There were also demonstrations of visual following and other looking behaviors.


Results from the Ikaros Workshop

April 13, 2015, 23.41

The Ikaros workshop was a great success and we would like to thank everyone who contributed. It was great to hear your comments and insights. The most important conclusion was the need for a graphical editor as well as introductory tutorials with videos and example code. This will be the main focus in the next few months. Another point is the consensus concerning a server based approach.

Many other interesting issues and suggestions were also discussed and we will try to incorporate many of them in the future, one of them being a mechanism to store the state of the system.

Ikaros Workshop, 9 April

March 20, 2015

The first Ikaros workshop will be arranged at Lund University on April 9th and will include presentations from a number of researchers working with Ikaros and a discussion of the future of the system.

Homogeneous matrices

April 20, 2014

The Ikaros math library has been updated with functions for homogeneous matrix operations. These functions simplify the creation and manipulation of homogeneous matrices and include a number of functions that parallel those for ordinary matrices.

These functions form the basis for a number of new modules in Ikaros that can be used to recognize objects in three dimensions, localize a robot based on visual landmarks, or compute forward and inverse kinematics for a robot arm. There are also interfaces in the works that will allow conversion between homogeneous matrices as spatial representations and population coding of space. This will make it possible to build hybrid systems where part of the robot control system uses traditional mathematical techniques and other parts use more neuromorphic methods.
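As a generic illustration of why homogeneous matrices are convenient for kinematics (a plain Python sketch, not the actual Ikaros API), a rotation and a translation each become a single 4x4 matrix, and a chain of joint transforms reduces to one matrix product:

```python
import math

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rot_z(angle):
    """Homogeneous rotation about the z-axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def translate(x, y, z):
    """Homogeneous translation."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def apply(m, p):
    """Apply a homogeneous transform to a 3D point."""
    v = [p[0], p[1], p[2], 1]
    return tuple(sum(m[i][j] * v[j] for j in range(4)) for i in range(3))

# Two-link planar arm: rotate at the shoulder, move along the first link,
# rotate at the elbow, move along the second link.
fk = mat_mul(mat_mul(rot_z(math.pi / 2), translate(1, 0, 0)),
             mat_mul(rot_z(-math.pi / 2), translate(1, 0, 0)))
tip = apply(fk, (0, 0, 0))  # end-effector position
```

The same product-of-transforms pattern extends directly to object recognition in 3D and landmark-based localization, where poses are likewise composed and inverted as matrices.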

A Humanoid Robot

March 7, 2013


A humanoid robot controlled by Ikaros was demonstrated today at the CCL meeting in Lund. The robot has fourteen degrees of freedom that allow it to show articulated movements with its head and arms. Ikaros is used for real-time control of the robot's Dynamixel servos and for visual processing of the video streams from the two eyes and depth sensor.

The control system of the robot uses the standard modules in Ikaros for nearly all processing, including visual tracking, depth perception, depth segmentation, face recognition and motor control. Using a sequencer module, the robot is able to produce well-timed movements in response to people it detects in front of it.

A Faster WebUI

November 3, 2012


The new WebUI in Ikaros 1.4 is substantially faster than the previous version. It is now possible to stream image data in full HD (1920x1080) to the web browser at 14 frames per second using the standard settings. With some tweaks it can run as fast as 20 fps.

In addition, the WebUI is now completely threaded and runs independently. This allows the WebUI to work smoothly regardless of whether Ikaros is running in play mode or in real-time mode.

We have also begun to rewrite most of the user interface in HTML5 + Canvas, which makes it much faster than the old SVG graphics. However, the WebUI still supports all the old SVG objects and will continue to do so for some time. First out of the new WebUI objects is the path object, which replaces the functionality of the two line objects. Many more WebUI objects will be added in the near future.

Ikaros on GitHub

September 20, 2012

Ikaros is now distributed through GitHub. This will allow us to more regularly update the public version of Ikaros. Two branches of Ikaros are available on GitHub. The stable branch contains the latest fully functional version and should be relatively bug free. The master branch contains the latest development version and may or may not work properly.

Although GitHub has been used by the project for some time, today marks the first time that this is made public. The current version is 1.4 and represents 10 months of work since the last release. In the future, there will be no official releases as we hope to regularly update the stable version.

To download the latest version of Ikaros, follow the installation instructions for each operating system.

Goal-Leaders

August 20, 2012


Ikaros is used within the EU-funded project Goal-Leaders to control a mobile robot with an arm and a gripper. The robot is part of the Builder Scenario, where robots will perform autonomous navigation and construction tasks and readapt, without reprogramming, to novel task allocations or changes in their environment. Ikaros is used for low-level control of the arm and locomotion as well as for visual processing. YARP is used to communicate between Ikaros and other systems.

The Goal-Leaders project aims at developing biologically-constrained architectures for the next generation of adaptive service robots, with unprecedented levels of goal-directedness and proactivity.

Ikaros 1.3 Released

November 11, 2011

The newest version of Ikaros was released today. It is the tenth public distribution of the Ikaros kernel together with a number of functional modules. This version is mainly intended for potential developers and not for more general use. The version includes more than 100 bug fixes and additions. The current distribution includes versions for OS X, Linux and Windows.

The Ikaros WebUI is now complete and supports hierarchical views of all running modules. There are 16 different graphical objects that can be used to monitor processes and three new controls (buttons, sliders and switches) that can be used to interact with Ikaros while it is running. The responsiveness of the WebUI is much improved, and it can now also operate when Ikaros is run in real-time mode. Many modules have default views that make it possible to view their outputs without defining any additional views.

Ikaros 1.3 includes 88 standard modules implementing various learning algorithms and neural network models as well as I/O and robot control algorithms. Most of these have been thoroughly polished and include example ikc-files.

The web site has been updated to show the latest documentation for all modules and APIs.

Updated Documentation

May 22, 2011, 23.52

The documentation on this web site is being updated for version 1.2 and there are currently some inconsistencies between the different parts of the documentation.

Comments Enabled

January 21, 2011

It is now possible to add comments on many pages of the web site. This includes all the module documentation pages, for example this one. This functionality is provided by DISQUS.

A Large Step for a Robot

August 18, 2010

This video demonstrates how the Dynamixel module in Ikaros can be used to control a humanoid robot. The robot is a standard BIOLOID Premium robot that is controlled over USB from a remote computer.

Each step is about the same width as the robot so there is very little space for the robot to turn around after climbing up the step. Unfortunately, the second step is approximately 5 mm higher which makes it necessary to use a different motion sequence. Using the same sequence again will make the robot fall down the stairs.

Ikaros registered with NIF

August 13, 2010


Ikaros was today registered with the Neuroscience Information Framework (NIF), which is a dynamic inventory of Web-based neuroscience resources: data, materials, and tools accessible via any computer connected to the Internet. An initiative of the NIH Blueprint for Neuroscience Research, the Neuroscience Information Framework advances neuroscience research by enabling discovery and access to public research data and tools worldwide through an open source, networked environment.

Version 1.2 of the IKC file specification published

June 15, 2010, 12.38

The specification of the newest version of the Ikaros Control File format is now available. The new version includes support for automatic sizing of outputs, complex delay specifications and full inheritance of attributes. In version 1.2 of Ikaros, it will be mandatory to describe each module in a corresponding IKC-file.

Version 1.2 Nearly Feature Complete

February 20, 2010, 1.18

A nearly feature complete version 1.2 of Ikaros is available as a CVS snapshot as version 1.1.456. There are still a few bugs to work out for the final release and the build system has only been tested with OS X so far but everything should be compatible with Windows and Linux.

The most visible new feature is a rebuilt WebUI with some support for interactivity. There are also functions to inspect the hierarchy in an IKC-file from the WebUI, as well as a graphical display of modules and connections. In addition, it is now possible to build new modules using only two functions, since most features of a module can be specified in the IKC-files.

Journal Article about Ikaros

September 7, 2009, 00.01

An article that describes Ikaros has been published in Advanced Engineering Informatics. The article describes the motivation for the system and how the different components work.

Batch Processing

September 6, 2009, 23.19

To allow Ikaros to be run in batch mode we are investigating different ways to automatically run several instances with different parameters. A new batch element has been introduced in the IKC-files that sets the default value for a parameter depending on the index of the process. The batch element defines a target parameter and a values-array with the different values. The inheritance mechanism in the IKC-files is used to assign the value to a module when that parameter is not specified in the module.

The batch element works like so:

    <batch target="sigma" values="0.1 0.2 0.3" />

This would assign the parameter sigma the value 0.1 for the process with index 1, 0.2 for the process with index 2, and so on. To run Ikaros in batch mode, the command line argument -B should be set. This functionality will be included in version 1.2. In the future, batch mode will use MPI to run on larger computers.
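The assignment rule described above can be sketched in a few lines (a hypothetical reimplementation for illustration only; the names `resolve_parameter` and `batch_spec` are not part of Ikaros): the process index selects one entry from the values list, and that value overrides the module's default through inheritance.

```python
def resolve_parameter(name, process_index, batch_spec, module_defaults):
    """Pick a parameter value for one batch process.

    batch_spec mimics <batch target="sigma" values="0.1 0.2 0.3" />;
    process indices are 1-based, as in the example above."""
    if batch_spec.get("target") == name:
        values = batch_spec["values"].split()
        return float(values[process_index - 1])
    # parameter not targeted by the batch element: fall back to the default
    return module_defaults[name]

batch = {"target": "sigma", "values": "0.1 0.2 0.3"}
sigma = resolve_parameter("sigma", 2, batch, {"sigma": 1.0})  # 0.2
```

Parameters that the batch element does not target simply keep their module defaults, which is what makes it possible to sweep one parameter across processes without touching the rest of the configuration.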

Generating Graphs from IKC Files

August 26, 2009, 15.00

We are investigating ways to automatically generate graphs from an IKC file. Here is a first example that uses a small set of modules and connections. The algorithm simulates physical repulsion between the modules and attraction along the connections. In addition, the forces calculated on a module depend on whether a connection is an input or an output.

The layout engine is written entirely in JavaScript and the rendering uses SVG. The algorithm was inspired by the description of force based algorithms on Wikipedia.
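In outline, such a force-based layout can be written as follows (a generic Python sketch rather than the site's JavaScript implementation, and without the input/output asymmetry): all modules repel each other with an inverse-square force, while each connection pulls its endpoints together like a spring.

```python
import math

def layout(nodes, edges, steps=200, repulsion=0.5, attraction=0.05):
    """Simple force-directed layout.
    nodes: {name: [x, y]}, edges: list of (source, target) pairs."""
    names = list(nodes)
    for _ in range(steps):
        forces = {n: [0.0, 0.0] for n in names}
        # pairwise inverse-square repulsion between all modules
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                dx = nodes[a][0] - nodes[b][0]
                dy = nodes[a][1] - nodes[b][1]
                d2 = dx * dx + dy * dy + 1e-9
                f = repulsion / d2
                d = math.sqrt(d2)
                fx, fy = f * dx / d, f * dy / d
                forces[a][0] += fx; forces[a][1] += fy
                forces[b][0] -= fx; forces[b][1] -= fy
        # linear spring attraction along each connection
        for a, b in edges:
            dx = nodes[b][0] - nodes[a][0]
            dy = nodes[b][1] - nodes[a][1]
            forces[a][0] += attraction * dx; forces[a][1] += attraction * dy
            forces[b][0] -= attraction * dx; forces[b][1] -= attraction * dy
        for n in names:
            nodes[n][0] += forces[n][0]
            nodes[n][1] += forces[n][1]
    return nodes

pos = layout({"A": [0.0, 0.0], "B": [1.0, 0.0], "C": [0.0, 1.0]},
             [("A", "B"), ("B", "C")])
```

After a few hundred iterations the nodes settle where repulsion and spring forces balance, so connected modules end up close together and unconnected ones drift apart.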

Support for NAO

August 7, 2009, 23.52


Birger Johansson at the University of Technology, Sydney, has added modules to Ikaros to control the NAO robot, a medium-sized humanoid robot developed by Aldebaran Robotics. NAO replaced the Sony AIBO robot as the standard platform for RoboCup. Ikaros can be used to read signals from the sensors, to process the images from the cameras and to control the motors of the robot.

RSS Feed

April 7, 2009, 22.16

The Ikaros site now has an RSS feed for the news.

Running on a Supercomputer

September 30, 2008

We are currently testing Ikaros on the computer Milleotto at LUNARC. Milleotto is an IBM blade-centre solution with a total of 1008 processor cores. We are also extending Ikaros with support for MPI in addition to the ad-hoc solution that we are currently using for multiprocessing support.

Presentations at IROS

September 22, 2008


Christian Balkenius and Magnus Johnson held two presentations at the International Conference on Intelligent Robots and Systems in Nice. The first presentation described the Ikaros system at the workshop on "Current software frameworks in cognitive robotics integrating different computational paradigms", while the other presentation described our work with haptic perception in a robotic hand.

Version 1.1 Released

July 7, 2008

We have released version 1.1 today as an update to version 1.0.0. This update does not include the project files but must be installed over an already existing installation. The new version mainly consists of bug fixes and the complete WebUI.

Testing CUDA

July 3, 2008

We have tested the math performance on the GPU to see whether it would be possible to increase the speed of Ikaros matrix operations. The results of a matrix multiplication running on an NVIDIA GeForce 8800 GT are promising: it is about 3 times faster than the currently fastest implementation in Ikaros on a 2 x DualCore Intel Xeon computer. The CUDA version ran more than 3000 times faster than the unoptimized version in Ikaros.

Toward Version 1.1

April 22, 2008

We are finishing up version 1.1 of Ikaros that will be released within a few days. The new version mainly contains bug fixes for version 1.0. In addition, a number of new WebUI objects have been added. UPDATE: The new version is delayed and will probably not be ready until late summer.

MiniBot Completed

December 10, 2007


The mobile robot MiniBot has been completed. It was built as part of the EU funded project MindRaces to study learning of anticipatory behaviors.

The robot is equipped with an arm using five digital servos and an active vision head with two degrees of freedom. The robot is controlled by a Mac Mini which was modified to run from a battery. All servos are controlled by the SSC-32 controller through the SSC32 module in Ikaros. The two motors for the wheels are controlled by the Motor Mind B through USB directly from the computer.

The robot uses Ikaros to implement a model of autonomous learning of anticipatory sensory-motor transformations. Using continuous observation of its hand and the behavior of a target object, the robot is able to learn motor behaviors that use predictions of the motion of the target. With such anticipatory behaviors, the robot can move its gripper to the anticipated future location of the target object.

Most of the modules used to control the robot will be available as part of release 1.1 of Ikaros.

Two New Articles

September 4, 2007

Two new articles were added to the site today. In "3D Interpretation of Edges: Part 1", Stefan Karlsson describes his work on the interpretation of depth in 2D-images and in "Tracking Colors with Ikaros", Christian Balkenius and Birger Johansson describe how a number of Ikaros modules can be combined to track colored objects.

Screen Shots of Version 1.1

August 22, 2007

Screen shots from the new WebUI in the forthcoming version 1.1 are now available. The new WebUI will include several enhancements of the graphical objects. The colors of most elements can be changed and color tables can be used for more advanced graphs.

Transparency is handled automatically if several objects are placed on top of each other, which makes it easy to draw information over an image. There are also several new objects that can be used as overlays for images to plot markers or traces.

Two other new objects draw 3D plots and polar plots. Both objects can be highly customized, as can be seen in the linked screen shots.

To make more elaborate layouts possible, the WebUI in version 1.1 allows the size and spacing of the objects to be set in the ikc file. This is useful if several small graphs need to be shown in a smaller area.

Finally, the new version will include a few objects that make the interface interactive. We are currently working on buttons and sliders that can be used to set inputs to modules. With these objects, it becomes possible to make interactive demos where parameters are changed during execution.

Running Ikaros on a Linux Cluster

August 16, 2007


Thanks to the hard work of Alexander Kolodziej and Birger Johansson, Ikaros now runs on the eight node Linux cluster Rama.

To concurrently control six robots, a number of Ikaros processes run in parallel in real-time mode with millisecond resolution. Communication between the different processes takes place over the internal ethernet network. For robot control, there are six Bluetooth channels to the six robots. The set-up uses real-time visual input to control the robots.

Although the current version of Ikaros (1.0) can only run in parallel on a single computer, future versions will include the functionality necessary to run multiple processes on different computers.

Tutorial at NBU in Sofia

July 28, 2007

Christian Balkenius and Birger Johansson gave a tutorial on Ikaros during the 14th International Summer School in Cognitive Science at New Bulgarian University in Sofia from July 23 to July 27.
For the tutorial, we developed a number of demos that will be included in future versions of Ikaros. One demo compares reinforcement learning with potential field methods and classical planning (A*) for robot control and navigation. Another demo illustrates how Ikaros can be used to predict the movement of a dynamic object and to learn an inverse model to control a simulated robot arm. We also showed how Ikaros can be used to control reactive behavior and for simple behavioral learning using the e-puck robot. Finally, we developed a number of demos of self-organizing maps.



Ikaros Works with Safari 3.0

June 11, 2007

Safari 3.0, which was released today, runs smoothly with Ikaros. It is no longer necessary to download WebKit to run Ikaros on a Mac. We have not yet confirmed that the Windows version of Safari works with Ikaros, but this is very likely. Update: The Windows version works well with Ikaros and is the fastest way to run the WebUI on a Windows machine.

Ikaros Version 1.0 Released

June 1, 2007

Ikaros 1.0 was released today. This is the eighth public distribution of the Ikaros kernel and a small set of modules. This version is mainly intended for potential developers and not for more general use. Only a minimal set of modules is included.

The new version includes an updated multi-threaded kernel with support for real-time execution, a completely reworked WebUI, an optimized math library, support for the new IKC files, as well as versions for Linux, OS X and Windows. All documentation is also available on-line at this site. This version represents six years of development work since the project started in early 2001. Download Ikaros.

Talk at Trolleholm

May 15, 2007


Christian Balkenius gave a presentation on Ikaros at the yearly research meeting of the Lund University Research Program in Medical Informatics (LUMI) at Trolleholm castle outside Lund in Sweden.

The talk presented Ikaros from a medical informatics perspective and focused on the model validation aspects of the system, where brain models can be automatically tested against neuroscientific databases.

Release Date for Version 1.0

May 14, 2007

Ikaros version 1.0 will be released on June 1, 2007. The new version will include an updated multi-threaded kernel with support for real-time execution, a completely reworked WebUI, an optimized math library, support for the new IKC files, as well as versions for Linux, OS X and Windows. All documentation will also be available on-line. This version represents six years of development work since the project started in early 2001.

Module Documentation

May 5, 2007

The module documentation is now live. The list of modules represents the standard modules that are planned for inclusion in version 1.0. This list is smaller than the number of modules distributed with earlier versions, since we only wanted to include those modules that have been thoroughly tested with the new version. The module documentation is automatically generated from the IKC files that are used to describe modules to the Ikaros kernel.

Updated Documentation

April 18th, 2007

A number of new articles have been added that describe Ikaros version 1.0. "System-Level Cognitive Modeling with Ikaros" gives an overview of the project and describes the functionality of the new version. The article "The Ikaros Math Library" describes the new mathematical functions and how they are used. Previous versions of the programming guides have also been updated to reflect the current version.

A New Design for the WebUI

January 15th, 2007

We are working hard toward version 1.0 of Ikaros. For the new version, the user interface has been reworked to allow for easy switching between different views. The picture below shows the design of the new user interface running in WebKit. It is also compatible with Firefox.
The WebUI uses JavaScript+SVG to render dynamic views of the operation of an Ikaros process. It contains classes that display images in color, grayscale or pseudocolor; bar graphs and plots; as well as vector fields and grids. The basic set of user interface elements can easily be extended using small pieces of JavaScript.

The new WebUI


An Optimized Math Library

January 14th, 2007

The next version of Ikaros will include an optimized math library that is used by all the standard modules. The functions in the library operate on the three types of data used in Ikaros: scalars, arrays and matrices.

Matrix operations include vector and matrix operations such as addition, subtraction, multiplication and division and will use the BLAS/ATLAS library when it is available on OS X, Linux and Windows.

The library also contains a number of image processing functions, including convolution and morphological operations. When running under OS X, the library makes use of the highly optimized vImage framework for image processing. Ikaros can also be compiled to use the vDSP and vForce libraries.

The new math library greatly increases the execution speed of Ikaros. In some cases, the speed has increased by a factor of ten or more compared to the previous Ikaros version. Because the math library is also used by the WebUI, it makes for a more responsive user interface. For example, it is now possible to view image processing of an image from a camera in real time.

The Contribution Section is Open

November 27th, 2006

The contributions section of this site was opened today. The contributions section will contain recent updates to modules and externally developed modules. A submission form is available for those who want to contribute modules to the project.

Ikaros Support for the e-puck Robot

November 20th, 2006

Ikaros now supports the e-puck robot developed at EPFL, a small mobile robot with a built-in color camera, eight IR proximity sensors and three acceleration sensors, together with a number of LEDs and wireless Bluetooth communication.

A new module has been added to Ikaros to handle communication with the robot. Several modules can be run in parallel to control several robots.
 
 

Second Draft Specification of Ikaros Control Files

November 8th, 2006

An updated draft specification of Ikaros control files (IKC) has been released today. The new file format supports automatic generation of help files from IKC files. An experimental list of automatically generated help pages is available. Comments on the draft are very welcome.

Tutorial on Feature Analysis at EpiRob

September 6th, 2006

Christian Balkenius and Christopher G. Prince will have a tutorial on featural processing of auditory and visual inputs at the Sixth International Workshop on Epigenetic Robotics in Paris, Friday September 22. The tutorial will cover Ikaros, YARP and Intel Open CV.

A feature is an auditory or visual element at a level of processing more abstract than that of the raw signal typically used as input in vision and audition (e.g., grayscale pixels of visual frames, or amplitude samples of audio), but less abstract than visual elements such as objects or auditory elements such as words. Perceptual features are needed for various tasks, such as learning auditory or visual categories. The raw sensory data (e.g., grayscale pixels for vision, or amplitude samples for audio) are typically too noisy (e.g., sensitive to illumination variation in the case of vision) to be used directly in these tasks.

The intent of this tutorial is to be as practical as possible, with attendees subsequently being able to make use of each of these techniques if they so please. To this end, we intend to provide working program code (through web links), and examples of usage. Background theory will be provided where this helps understanding, but the main goal is for people to come away with code they can use and a clear idea of where they might apply these techniques in their own projects.

Ikaros runs with WebKit

September 1st, 2006

The nightly builds of WebKit for OS X are now compatible with Ikaros, as its implementation of SVG and JavaScript includes all the required functionality. We look forward to the first version of Safari to include this functionality, which will add a second Ikaros-compatible browser on OS X in addition to Firefox.

Tentative Specification of Ikaros Control Files

May 23rd, 2006

A tentative specification of Ikaros control files (IKC) has been released today. Comments on the draft are very welcome.

A New Tool for Model Validation

May 10th, 2006

System-level models of the brain are widely used in the cognitive sciences. Validating the architecture of such models against neurobiological facts often involves extensive literature research. The Evaluator of System Level Models (ESYLM), designed by Sepp Kollmorgen and Sylvia Schröder, finds relevant literature and constructs arguments for a model's architecture by performing inferences on neurobiological data. The set of allowed inference rules can be specified according to the researcher's paradigms. ESYLM's inference procedure allows for faster computation than simple search algorithms. The evaluation produces a written report.
The implemented system works on Ikaros model descriptions in XML combined with data extracted from the CoCoMac database, is implemented in a combination of Java and Prolog, and uses LaTeX to neatly format the validation report.
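As a rough sketch of the idea (the region names, facts, and single relay rule below are hypothetical illustrations, not ESYLM's actual rules or data), this kind of validation can be seen as inference over anatomical projection facts:

```python
# Hypothetical sketch of rule-based model validation: check each
# connection in a model against anatomical projection facts, with a
# user-specifiable inference rule (here: transitivity via one relay).

projections = {            # anatomical facts, e.g. from a connectivity database
    ("V1", "V2"), ("V2", "V4"), ("V4", "IT"),
}

def supported(src, dst, allow_relay=True):
    """Return an argument for the connection src -> dst, or None."""
    if (src, dst) in projections:
        return f"{src} projects directly to {dst}"
    if allow_relay:
        for a, b in projections:
            if a == src and (b, dst) in projections:
                return f"{src} reaches {dst} via relay {b}"
    return None

model_connections = [("V1", "V2"), ("V1", "V4"), ("V1", "IT")]
report = {c: supported(*c) for c in model_connections}
```

A real system additionally ranks and cites the literature behind each fact; the point here is only that each model connection either acquires a constructed argument or is flagged as unsupported.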

New Web Site

March 23rd, 2006

The new web site for the Ikaros project will soon appear here. During the transition, all material will continue to be available at the old site.

A Multi-Threaded Kernel

February 2, 2006

The new multi-threaded kernel was tested for the first time today. The new kernel will allow Ikaros to automatically take advantage of multiple processors. In addition, it makes it possible to run modules at different frequencies, which makes robot control much easier. For example, a motor control loop can run in the kHz range while the visual system may run at only a few Hz. This functionality will be available in version 0.8.2.
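The multi-rate idea can be sketched in a simplified single-threaded form (the actual kernel uses threads; the module names and periods below are purely illustrative):

```python
# Sketch of multi-rate scheduling: each module declares a period in
# kernel ticks, and the kernel runs it only on the ticks it is due.
# (Illustrative only; the real Ikaros kernel runs modules in threads.)

ticks_run = {"MotorControl": 0, "Vision": 0}

modules = {
    "MotorControl": 1,    # every tick (fast control loop)
    "Vision": 100,        # every 100th tick (slow visual processing)
}

for tick in range(1000):
    for name, period in modules.items():
        if tick % period == 0:
            ticks_run[name] += 1   # stand-in for running the module once
```

Over 1000 ticks, the control loop runs 1000 times while the vision module runs only 10 times, which is exactly the kHz-versus-a-few-Hz split described above, just at a smaller scale.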

Version 0.8.1 Released

November 25th, 2005

Ikaros version 0.8.1 has been released. The new version includes a web-based user interface where running Ikaros processes can be controlled and monitored from a web browser. The interface uses SVG for visualization and works with Firefox 1.5 as well as with Adobe's SVG plug-in.

First Version of the Web Viewer Running

September 6th, 2005

The first version of the new web-based viewer for Ikaros ran for the first time today. Ikaros is extended with a small web server that interacts with a web browser to show the state of a running Ikaros process. The browser side of the viewer combines JavaScript and CSS with SVG rendering of images and graphs. A plug-in interface similar to that used for Ikaros modules allows arbitrary visual elements to be specified in SVG. The web interface requires the next version of Firefox, which will soon be released, and has been tested with Deer Park Alpha 2.

SenseStream ported to Ikaros

August 28, 2005

Chris Prince at the University of Minnesota Duluth has ported the SenseStream system to Ikaros. The system implements synchrony detection based on the Hershey and Movellan (2000) algorithm. This method computes the mutual information between the audio and visual streams of a QuickTime input file. It also uses the centroid of this mutual information to segment out faces from the video.

The original system is described in Prince, C. G. & Hollich, G. (2005). Synching models with infants: A perceptual-level model of infant audio-visual synchrony detection. Journal of Cognitive Systems Research, 6, 205-228.
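Under a Gaussian assumption, the mutual information between two signals reduces to a simple function of their correlation coefficient. A minimal sketch of this core quantity (our own illustration, not the SenseStream code):

```python
import numpy as np

def gaussian_mi(x, y):
    """Mutual information (in nats) between two signals under a
    Gaussian assumption: I(X; Y) = -1/2 * log(1 - rho^2),
    where rho is the correlation coefficient of x and y."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

rng = np.random.default_rng(0)
audio = rng.standard_normal(500)

synced_pixel = audio + 0.3 * rng.standard_normal(500)   # co-varies with the audio
other_pixel = rng.standard_normal(500)                  # unrelated to the audio

mi_synced = gaussian_mi(audio, synced_pixel)
mi_other = gaussian_mi(audio, other_pixel)
# The synchronous signal carries far more information about the audio.
```

Applied per pixel over a sliding window of video frames, high-MI regions mark where the image moves in synchrony with the sound, which is what lets the centroid of the MI map pick out a talking face.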

Screendump of SenseStream running in Ikaros


Ikaros + BoeBot + Bluetooth

February 1, 2005

Two BoeBots controlled by Ikaros

Birger Johansson has developed Ikaros modules for Bluetooth communication with BoeBot robots. The communication protocol allows Ikaros to control many robots at the same time.

The goal is to study the collective behavior of a number of robots in various tasks. At present, the individual robots do not have any sensory systems; instead, an overhead camera tracks the motion of all the robots and gives Ikaros their current coordinates.

The robot set-up is used within the EC-funded project MindRaces.

Version 0.8.0 Released

January 26, 2005

Ikaros version 0.8.0 has been released. The new version includes 64 standard modules and 37 contributed modules. There are several new features, including the ability to delay signals between modules in a simple way; it is even possible to have no delay at all between modules to speed up simulations. There are also several new ways to schedule the execution of modules.
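The delay semantics can be sketched as follows (a simplified serial illustration of connection delays in general, not the Ikaros API):

```python
# Sketch of delayed vs. zero-delay connections between two modules.
# With delay 1, module B reads A's output from the previous tick;
# with delay 0, B reads A's output from the same tick (which requires
# running A before B within the tick).

def run(delay, n_ticks=5):
    a_out = 0          # A's most recent output (initially zero)
    trace = []
    for tick in range(n_ticks):
        prev = a_out
        a_out = tick                        # module A: emits the tick number
        b_in = a_out if delay == 0 else prev
        trace.append(b_in)                  # module B: records its input
    return trace

zero_delay = run(0)   # B sees A's current output:  [0, 1, 2, 3, 4]
one_delay = run(1)    # B sees A's previous output: [0, 0, 1, 2, 3]
```

With a delay on every connection, modules can execute in any order (or in parallel); zero-delay connections remove the one-tick lag but constrain the execution order, which is why removing delays can speed up simulations.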

Work toward version 0.8.0

January 18, 2005

The release of Ikaros version 0.8.0 is planned before the end of January 2005. The new version will include 100 modules and several new features including the ability to delay signals between modules in a simple way. It is even possible to have no delay at all between modules to speed up simulations. There will also be several new ways to schedule the execution of modules.

Subproject on 3D Interpretation of Edges Completed

August 1, 2004

A number of modules have been added to Ikaros within this subproject. These modules process line images: they find illusory contours, perform amodal completion, and sort surfaces in depth based on local image cues such as T-junctions and line endings.

Output from the system.


Ikaros + Keepon

July 1, 2004

Jan Morén with the Keepon

Jan Morén at NICT in Japan is controlling the robot Keepon with Ikaros. The robot, which was designed by Hideki Kozima, uses Ikaros to grab images and to locate faces and a toy rabbit.

A number of new Ikaros modules have been developed for high-speed image processing, including fast visual template matching and color-based image segmentation.
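Visual template matching of this general kind can be sketched with normalized cross-correlation (our own brute-force illustration, not the actual high-speed Ikaros module):

```python
import numpy as np

def match_template(image, template):
    """Brute-force normalized cross-correlation; returns the top-left
    position where the template matches the image best."""
    th, tw = template.shape
    t = template - template.mean()
    best, best_pos = -np.inf, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos

# Embed a small cross pattern at a known location and recover it.
image = np.zeros((20, 20))
template = np.array([[0., 1., 0.],
                     [1., 1., 1.],
                     [0., 1., 0.]])
image[7:10, 12:15] = template
pos = match_template(image, template)   # best match found at (7, 12)
```

Fast implementations compute the same score in the frequency domain or with running sums rather than looping over every position, but the matching criterion is the same.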

Somatosensory Processing Model Completed

June 15, 2004

The different somatosensory processing steps

The first part of the subproject "Somatosensory Processing and Cortical Plasticity" has been completed. The developed model includes processing stages corresponding to the palm, medulla, thalamus, and somatosensory cortex and is used to simulate reorganization of somatosensory cortex after nerve injury. Read more »

The figure shows the activity in the different processing steps in one of the models without nerve injury. The different images illustrate the sensory input, the initial coding, two intermediate steps, and the cortical coding of the stimulus.

First Brain Database

June 3, 2004

A minimal prototype of the brain database has been implemented as a proof of concept. The implementation uses semantic web techniques to store brain data and to reason about them. It also illustrates how it is possible to validate an Ikaros experiment file against the brain data. Read more »

Ikaros Controls a Mobile Robot

May 7, 2004

Birger Johansson with LUCSOR III

For the first time, Ikaros is used to control a mobile robot. The robot LUCSOR III is built for outdoor navigation. It uses a number of Ikaros modules for motor control, to grab images from the camera, and to control the pan/tilt camera head. In addition, Ikaros modules for image processing, including elastic template matching, are used for visual navigation.

Ikaros runs on a standard Linux computer on the robot, which is powered by two 12 V accumulators that also power the motors. The computer uses two serial connections to talk to the control system that governs the speed and acceleration of the two motors. The camera head is controlled over an Ethernet connection.

The robot LUCSOR III was built by Birger Johansson from the disaster called LUCSOR II left by Christian Balkenius and Jan Morén.

Ikaros 0.7.7 Released

March 22, 2004

Version 0.7.7 of Ikaros has been released. The new version contains many bug fixes and a number of new modules.

Version 0.7.6 Released

October 31, 2003

Ikaros version 0.7.6 is now available for download for Linux and OS X. This distribution includes the command line version of Ikaros with many minor changes and bug fixes.