In spring 1998, from 29 April to 26 May, the Department of Teleinformatics invited a number of speakers to give lectures. The lectures were to be presented on the MBone and, in particular, to the other sites in the Graduate School of Teleinformatics. We had a full-duplex 8 Mbps link to HKR, but traffic to MH had to go over SUNET. Since we could not rely on multicast working reliably at all sites at the same time, I decided to use unicast to the remote sites. Figure 1 shows the network as it looked at the time.
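The trade-off behind that decision can be illustrated with a small sketch: with replicated unicast, the sender transmits one copy of every packet to each remote site, which costs bandwidth but only depends on plain IP routing working. The addresses below are placeholders, not the actual 1998 hosts.

```python
import socket

def send_unicast(payload: bytes, sites) -> None:
    """Replicate one media packet to every remote site over unicast UDP.

    With multicast, a single send to a group address would reach all
    subscribed receivers, but only if multicast routing works at every
    site; replicated unicast trades extra bandwidth for predictability.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in sites:
            sock.sendto(payload, addr)
    finally:
        sock.close()

# Placeholder addresses standing in for the HKR and MH receivers.
REMOTE_SITES = [("127.0.0.1", 50001), ("127.0.0.1", 50002)]
send_unicast(b"media frame", REMOTE_SITES)
```

The cost of this choice is linear in the number of sites: with two remote sites, every stream crosses the local bottleneck links twice, which is what makes the bandwidth budget below matter.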
The link between magda-gw and inner-gw still consisted of two unidirectional
10BaseT links, giving almost 10 Mbps of throughput in each direction. The link
between inner-gw and it-7513-gw was 100 Mbps; although it was not dedicated to
the seminars, I considered it unlikely to become a bottleneck in the system.
The HKR link could cope with 8 Mbps, while MH's access to SUNET was only
2 Mbps, shared with the rest of the traffic to and from MH. The link to MH
was a clear bottleneck, but out of our control.
Considering the capacity of the sending machines and the fact that we used
the H.261 codec, the maximum total bandwidth use would probably be between
6 and 7 Mbps full duplex on the link between inner-gw and magda-gw.
To get maximal-quality connections to the Graduate School sites, we had to
utilize as much as possible of the available bandwidth between inner-gw and
magda-gw, which meant we had no room left for MBone traffic on that link.
So I placed golden-gate, a machine on a subnet of inner-gw, as the MBone
transmitter.
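The budget above can be sketched as simple arithmetic. The per-stream rates below are illustrative assumptions, not measured 1998 values; they are chosen only so the total lands in the 6 to 7 Mbps range stated above.

```python
LINK_CAPACITY_MBPS = 10.0   # magda-gw <-> inner-gw, per direction

# Assumed streams per direction on the bottleneck link:
# one video and one audio stream to/from each of HKR and MH.
VIDEO_MBPS = 3.0            # assumed cap per H.261 video stream
AUDIO_MBPS = 0.1            # assumed rate per audio stream
SITES = 2                   # HKR and MH

def budget(video=VIDEO_MBPS, audio=AUDIO_MBPS, n=SITES):
    """Return (total Mbps per direction, headroom left on the link)."""
    total = n * (video + audio)
    return total, LINK_CAPACITY_MBPS - total

total, headroom = budget()
```

Whatever headroom such a calculation leaves is an upper bound: maximizing per-stream quality pushes the total toward the link capacity, which is why the MBone transmission was moved off this link entirely.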
There were some other subnetworks connected to inner-gw that are not shown in
Figure 1, but they would not normally contribute any traffic on the link
between magda-gw and inner-gw or on any of the other bottlenecks mentioned.
Also, to get the best possible quality, I decided to use separate machines for video transmission and reception: skutt and dumle for transmission and humle for reception. The reason for this was to take advantage of the dedicated link to HKR. humle was an SGI O2 with the ability to grab part of the screen and dump it to a composite video-out port. That way we could combine the two received video streams from MH and HKR as we wished, for display on a projector screen or a TV in the lecture room. The flow plan is shown in Figure 2. The main reason I did not want to split audio out/in onto separate machines was that we only had one 10 Mbps Ethernet connection to humle and expected a lot of incoming video traffic from the remote sites. Another reason was that the program we used was not designed for asymmetric use, and I did not want to risk any acoustic loopback effects caused by this, however improbable.
The lectures were held in lecture room C2, which had already been equipped
for audio and video communication during the Sweden@Stanford escapade of
1996, but it needed a bit of freshening up, and it took a few days to check
everything. Nearby we had a video room, also a leftover from the same
project. In this room we had an audio mixer, a video mixer, and analog
connections to the equipment in C2. There were also facilities for
computer networking, since the magda-gw router was located there.
Our initial plan for the analog feeds is shown in Figure 3.
We used different audio-out ports for HKR (Send 4) and MH (Send 3) to avoid sending their own audio back to them; the channel assignments for the connection to C2 are shown in Figure 3.
During testing with the remote sites, a strong humming sound was identified.
We found that it occurred when using an SGI O2 as audio transmitter and an
SGI Indy as audio receiver, so we decided to move the audio traffic to/from
MH to a nearby SS10 called jerker.
The resulting new flow plan is shown in
Figure 4 and the new analog plan is shown in
Figure 5.
A word of advice for the future: never use an O2 as transmitter and an SGI
as receiver when using the Robust Audio Tool (rat v3.0.23) from UCL. The
phenomenon has also been encountered when using an SGI O2 as audio receiver.
In C2 we had studio lighting equipment that allowed data projection at the
same time as we got acceptable quality from the camera capture, as long as
the lecturer could be persuaded not to leave the podium.
We had audience microphones to allow recording of questions from the
audience, but it was difficult to make people use the microphones in the
heat of debate.
We first used a video projector to show the remote participants, but since
the picture was too big and was located at the front of the room, every
movement of the remote participants distracted the audience from the
lecturer. We also learned that it is necessary to pan the camera now and
then to show the remote participants the location of their projection, so
they can act in accordance with their position in the room.
Some remote participants adopted the typical TV behaviour (I see them,
they don't see me): they fell asleep, picked their noses, and did other
distracting things while being shown in a very big image at the front of
the room.
Another typical TV behaviour was that we never got any questions from
the remote sites, even though they had a specially assigned time slot for
this in the agenda.
For the following seminars, we used an ordinary TV at the side of the
room instead. A video projector might still be used if the remote audience
is big and the projection is located at the side of the room.
The plan was to distribute slides to the remote sites before the lectures and then show each new slide for a short while in the video stream, so the remote participants could follow the presentation. This was because the video quality was not high enough for the remote participants to read the slides on screen, we did not have facilities for a distributed whiteboard at the remote sites, and we could not distribute presentation control information from the lecturers' PCs. However, the first presentation was copyrighted, so we were not allowed to distribute it, and some other presentations contained animations and sounds, plus the standard last-minute changes, and could not be distributed either. Besides, even if the lecturer can resist the urge to run in front of the slide projection and point at it with his hands or a pointer stick, he will surely point at the things he is talking about with the mouse pointer on his PC, which of course has about the same colour as the slide. A word of advice: make sure you never leave a pointer stick in the vicinity, and that the mouse pointer is big enough and of a different colour than the slides. Then it might actually show up on the video to the remote sites, and on the overhead projection too.
As it was, the remote audience became smaller and smaller, and the last lecture had no remote participants. Since we also recorded the lectures and made them available on the WWW together with the slides (although the video was in a very small format), I guess there was no motivation to sit and listen for 1.5 hours without being able to see the slides. It would be interesting to compare the number of users of the web material with the number of remote participants and the number of MBone participants. The only confirmed usage data we have come from those who have contacted us about the streamed material on the web and from the count of remote participants. Four persons have contacted us about the streamed material: three because they could not attend the real lecture (time zone differences, other meetings) and one who wanted to see the lecture again. The number of "passive" Web users and the number of MBone viewers are unknown.
Since the people at MH and HKR have shown very little interest in the seminars, we should investigate why. Maybe we should involve them more in deciding whom to invite, or even distribute the organization of the seminars over all the sites. Also, if we want more people from outside the Graduate School of Teleinformatics to participate, and to promote our department, we should give some seminars that are more oriented towards popular science and research overviews.
Maintained by Tobias Öbrink