Session Report - Bifco Panel Repair

Moderator: Torsten Layda
Recorder: Somik Raha

The session began with a 30-minute study of the requirements document for the Bifco Panel Repair project. Mohammad Mohammadi started off with a question on disassembling the panel - and it was agreed that the panel was not to be disassembled. Christian asked a question about the environmental chamber - and it was clarified that this might be a dust-free chamber.

The next question tackled was on panels - what are they? Do we care? The group agreed that we were only discussing 3-D panels. They were probably aircraft panels - given that the author of the problem was from Boeing - but we didn't really care about the type of panel.

In the diagram on page 3, we discussed whether the rotary arm (2) could be in the exclusion zone. The agreement was that it could - it was the tool head that should not be in the exclusion zone. A lot of time was spent understanding the diagrams on page 3. It was pointed out that the two diagrams were of separate robots, and that our software needed to support both. Future robots might combine both functionalities (linear and rotary) and should be easily supported by our software.

It was clarified that we were designing software and not a new robot with the two functionalities. Regardless of the type of robot, our software should work.

A lot of time was spent discussing the count on page 6 - the group tried to understand the smallest unit and its interface with the end user (does the user specify counts, or some other unit?).

A discussion on the automatic mode led to the conclusion that it was not truly automatic, but semi-automatic.

Mohammad suggested a specific context for the application - painting a surface (probably airplanes) - and it did help the group get a better picture, though there were some cases where the example didn't fit completely.

At this point, a story-based approach was suggested and the group set about writing user stories (shown below) for the system.

[Picture: the user stories written by the group]

Around the same time, issues related to the stories were also discussed, and noted.

The moderator introduced the metaphor of the joystick - that helped clarify specification 1.b (in the scope), which talked about ensuring that automatic movement was restricted to the work area. It was agreed that speed and direction were important for a painting application.
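The joystick metaphor can be sketched in code: the operator supplies a speed and a direction, and the system keeps automatic movement inside the work area. This is a minimal illustrative sketch - the class and function names are ours, not from the specification, and the work area is modeled as a simple rectangle.

```python
import math

class WorkArea:
    """A rectangular work area that automatic motion may not leave."""
    def __init__(self, x_min, y_min, x_max, y_max):
        self.x_min, self.y_min = x_min, y_min
        self.x_max, self.y_max = x_max, y_max

    def clamp(self, x, y):
        """Return the nearest point inside the work area."""
        return (min(max(x, self.x_min), self.x_max),
                min(max(y, self.y_min), self.y_max))

def step(pos, speed, direction_deg, area, dt=1.0):
    """Advance the tool head one tick from a joystick-style speed/direction
    input, never letting it leave the work area."""
    dx = speed * dt * math.cos(math.radians(direction_deg))
    dy = speed * dt * math.sin(math.radians(direction_deg))
    return area.clamp(pos[0] + dx, pos[1] + dy)
```

A move that would overshoot the boundary is simply clamped to it, which matches the agreement that only the work area limits automatic motion.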

The boundary condition was discussed - that the tool head moves until it hits a wall or is stopped.

There was a lot of discussion on voids, and how they were formed in a polygon.

There were some initial comparisons with exclusion zones - a point was then made about voids being automatically formed in irregular polygons. At this juncture, the group decided to leave this as an issue, for possible clarification with the author of the document, and move on.

Zone definitions were discussed at length - are they easier to define as rectangles or as polygons? This led to a discussion of the use of the remote controller. It was clarified that the remote controller would make it possible to define zones - not by tracing freehand irregular polygons, but by marking the vertices of the polygon. There was a discussion of the artifact created at the end of the rotary/linear motion: it could be points or a motion abstraction (for the rotary case). There was some talk of closing the polygon, which led to a discussion of the need for editing the polygon - to fine-tune the vertices.

The real-time display was discussed at length. Does it show features during digitizing? The group didn't think so. This was confirmed from the specification document.

Joseph was concurrently recording the assumptions being made.

[Picture: list of assumptions]

There were more display questions - how does point 3 (in "Scope") square with point 2 (in "System Interface Scope")? One says that a graphics context is provided to our system, which draws its state on that context. The other says that our system informs the display of its tool-head and arm positions, and the display does the drawing. This was resolved to mean that the first case was about system status being displayed (what state the system is in - busy, current job description, etc.).

Segments were discussed at length. It was agreed that this was a term describing the motion of the tool head.

At this point, Mohammad came forward to facilitate the mapping of the various components of this system.

[Picture: component mapping of the system]

There was a discussion on the digitizer. The group agreed that the specification was unclear on the utility of the digitizer device, and that the remote controller was a sufficient device for the digitization process. As it didn't seem to add to the understanding of the picture, the digitizer was crossed out of the mapping document.

Mohammad's query - "how does the robot talk to the MCS?" - led to a very good dialogue. There was some disagreement on communication between the MCS and the robot. Jesper held the view that all communication would go through the motion control card, whereas the moderator (Torsten) argued that the robot would have some way of directly notifying the MCS of its type.

There was some discussion of Traverse Alignment. Configuration issues were also dealt with at this point; it was assumed that this was being taken care of.

There was a discussion of what was to be done with the model diagram, and the group agreed that it was sufficient in its current form. Joseph had a query on where zones were being handled - the group clarified that this would happen in the MCS block. Christian and Jesper pointed out that the MCS would need to have a model of the world (the panel).

The group proceeded to brainstorm a list of class names that might possibly be used.

[Picture: brainstormed class names]

This was followed by a first attempt at a class design for the Automatic Motion story.

[Picture: first class design for the Automatic Motion story]

There was a lot of discussion about the current position and a panel holding a list of repair and exclusion zones. It was concluded that the current position did not belong in the panel. An automatic motion class would control the movement in the repair zone and account for the exclusion zones. It would do this by creating segments within the repair zone.
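The conclusion that automatic motion covers the repair zone by creating segments that avoid the exclusion zones can be illustrated in one dimension - a single scan line, with zones reduced to intervals. This is a deliberately simplified sketch of the idea, not the group's actual design; all names are ours.

```python
def plan_segments(repair, exclusions):
    """Split the repair interval into motion segments that skip exclusions.

    repair     -- (start, end) of the repair zone along the scan line
    exclusions -- list of (start, end) exclusion intervals
    """
    start, end = repair
    segments = []
    cursor = start
    for ex_start, ex_end in sorted(exclusions):
        if ex_start > cursor:
            # Move freely up to the edge of the next exclusion zone.
            segments.append((cursor, min(ex_start, end)))
        # Skip over the exclusion zone.
        cursor = max(cursor, ex_end)
        if cursor >= end:
            break
    if cursor < end:
        segments.append((cursor, end))
    return segments
```

For example, a repair zone from 0 to 10 with exclusions at (3, 4) and (7, 8) yields the three segments (0, 3), (4, 7), and (8, 10). The real 2-D case would repeat this over many scan lines, or use proper polygon decomposition.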

A suggestion was made to evolve the system by focusing on a small part of the functionality - namely, automatic motion for a linear robot. Along the lines of this suggestion, a first evolution was drawn up.

[Picture: first evolution of the design]

The group collaborated to refine this diagram and produce the next one.

[Picture: a refined version of the above diagram]

This diagram was the basis for a lot of design discussion and debate. It was agreed that the Automatic Motion Manager (AMM) would have an instance of Head - which would hold its current position. This would interact with the MCC (motion card controller). The AMM would also contain a panel - which is a collection of repair zones and/or exclusion zones. Segments would be created upon starting the motion manager and would then be used to complete the motion.
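The composition agreed on here can be sketched as a class skeleton: the manager owns a Head (holding the current position) and a Panel, talks to the MCC, and builds its segments when started. The method bodies are placeholders of our own; only the structure reflects the diagram.

```python
class Head:
    """Holds the current position of the tool head."""
    def __init__(self, position=(0.0, 0.0)):
        self.position = position

class Panel:
    """A collection of repair zones and/or exclusion zones."""
    def __init__(self, repair_zones, exclusion_zones):
        self.repair_zones = repair_zones
        self.exclusion_zones = exclusion_zones

class AutomaticMotionManager:
    """Owns a Head and a Panel; drives the MCC (motion card controller)."""
    def __init__(self, panel, mcc):
        self.head = Head()
        self.panel = panel
        self.mcc = mcc
        self.segments = []

    def start(self):
        # Segments are created up front, then used to complete the motion.
        self.segments = self.plan_segments()
        for segment in self.segments:
            self.mcc.move(segment)

    def plan_segments(self):
        # Placeholder: real planning would cover the repair zones while
        # accounting for the exclusion zones.
        return list(self.panel.repair_zones)
```

Handing the MCC in through the constructor keeps the manager testable with a fake motion card.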

The moderator pointed out that as this was a focused case of the linear robot, the "Automatic Motion Manager" ought to be renamed to "Automatic XY Robot Motion Manager" (which was done).

There was a lot of talk about "manual motion" as opposed to automatic and remote-control motion. It was agreed that manual motion was part of the process of using the remote controller to obtain zone definitions. The group decided to incorporate the rotary robot (or radial robot) in the next evolution.

[Picture: next evolution, incorporating the rotary robot]

This evolution tried to hash out the story on remote motion control, with an attempt at incorporating both types of robots - linear and rotary (radial). The Segment class caused some confusion, but that was ultimately clarified: it referred to any continuous line or arc.

The moderator pointed out that dealing with the Rotary and Linear segments at this stage of the design was probably a breakthrough.

The Remote Control Interface would allow notification of events such as a modification of speed or direction. The Remote Motion Control would be rigged up with the appropriate SegmentFactory, which would translate this input into the appropriate ending position.
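The SegmentFactory idea - one factory per robot type, each turning a speed/direction input into an ending position for its kind of segment - might look like this. The signatures and names are illustrative assumptions, not the group's actual interfaces.

```python
import math

class LinearSegmentFactory:
    """Translates speed/direction into the end point of a straight segment."""
    def segment_end(self, pos, speed, direction_deg, dt):
        x, y = pos
        return (x + speed * dt * math.cos(math.radians(direction_deg)),
                y + speed * dt * math.sin(math.radians(direction_deg)))

class RotarySegmentFactory:
    """Translates angular speed into the end of an arc segment."""
    def __init__(self, arm_length):
        self.arm_length = arm_length

    def segment_end(self, angle_deg, angular_speed, dt):
        # A rotary "segment" is an arc: the end is a new arm angle.
        return angle_deg + angular_speed * dt
```

Wiring the Remote Motion Control with the factory that matches the robot type keeps it ignorant of whether a segment is a line or an arc - which is where dealing with both segment kinds early pays off.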

An important topic of discussion was the issue of boundaries - who would handle the situation of the arm moving off the panel. It was agreed that the Remote Motion Control could be initialized with a panel and take care of it, or delegate to the panel to find out if a point was outside.
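The delegation option discussed here - initialize the Remote Motion Control with a panel and ask the panel whether a point is outside - can be sketched briefly. The panel is modeled as a simple rectangle purely for illustration; the names are ours.

```python
class Panel:
    """Illustrative rectangular panel: knows its own boundaries."""
    def __init__(self, width, height):
        self.width, self.height = width, height

    def contains(self, point):
        x, y = point
        return 0 <= x <= self.width and 0 <= y <= self.height

class RemoteMotionControl:
    """Initialized with a panel; delegates the boundary question to it."""
    def __init__(self, panel):
        self.panel = panel

    def allow_move(self, target):
        # Delegate rather than duplicating boundary logic here.
        return self.panel.contains(target)
```

Delegating keeps the boundary knowledge in one place, so a later non-rectangular panel only changes `contains`.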

The mechanism of controlling the robot was hotly debated, with two versions coming up - one where the final position would be computed and the robot set in motion to reach it, and the other where small increments would be made at periodic intervals. Both options are represented on the stickies (see picture above).
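The two debated mechanisms can be put side by side in a 1-D sketch (illustrative names, not the group's design): command the computed end position once, or nudge the position by a small increment each tick.

```python
def move_to_target(position, target):
    """Version 1: compute the final position and issue a single move."""
    return target

def move_incrementally(position, target, step=1.0):
    """Version 2: advance by a small increment at each periodic tick."""
    if abs(target - position) <= step:
        return target
    return position + step if target > position else position - step
```

The single-move version is simpler, while the incremental version leaves room to react mid-motion - e.g. to a stop request or a boundary - which is the usual trade-off between the two.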