
mectojic

Original poster
Goal: Get into a video call and share video, using a USB 1.0/1.1 webcam (ideally) or an iSight/FireWire/USB 2.0 camera.

I know people have successfully brought instant messaging to PPC, through a variety of services. Most of these services (Discord, Google, Skype, Telegram, Facebook) have a video calling component; and iChat and other legacy software supported video calling.

So can we go the next step? Is there any way to make a video call in PPC in 2022?
 
iSight/FireWire/USB 2.0 would need to be the standard, I'd wager. For starters, it allows for HD devices.
I have a vintage USB 1.1 cam, which is what I ideally want to use in the end, for that particular vintage appeal (period-correct hardware for a 1999 G4).
 

I see.

That's rather limiting in terms of potential for the sake of period-correct nostalgia, but horses for courses. Anyhow, it'll be interesting to see how this develops. :)
 
One problem besides the codecs pointed out by @Dronecatcher is the interface.

Many argued in the past about the USB vs. FireWire load on the system. Theoretically FireWire has an advantage because it "allows high-level protocols to run without loading the host CPU with interrupts and buffer-copy operations", which can be more performant and reduces the load on the host system.

Why were so many professional audio interfaces built with FireWire ports back then? Because USB at the time was the "inferior" option, with no standard host controller across manufacturers (UHCI vs. OHCI) until xHCI arrived with USB 3.0.

I haven't personally tried and measured the impact on my systems, but that's something I'm interested in doing in the near future.

So if you add the CPU (and overall system) load of acquiring/processing the image plus the audio/video codecs, the system needs to be considerably powerful.

Unless you use a low frame rate (probably 20 fps or less), restrict the resolution, and use codecs suited to the system... It's a hell of a complicated project to pull off.
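
To put rough numbers on it (my own back-of-the-envelope figures, assuming uncompressed 4:2:2 capture, not measurements):

  320 x 240 px x 2 bytes x 15 fps ≈ 2.3 MB/s ≈ 18 Mbit/s
  USB 1.1 full-speed bus: 12 Mbit/s, shared

So even a modest 320x240 @ 15 fps feed doesn't fit over USB 1.1 uncompressed, which is why those old cams compress in hardware (or drop to 160x120 and lower frame rates) and leave the decompression to the host CPU, on top of whatever codec the call itself needs.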
 
The problem will be the video codecs the major platforms use: this will be unworkable on all but the fastest G5s.
Alright, how about we start with an easier challenge: has anyone been able to do video conferencing on a PPC G5 in 2022?

I do know that ActionRetro managed to stream himself to Twitch on a G5; that was with Linux. I’m not against a Linux solution, but would prefer success in OS X.
 
I guess we could do video conferencing locally using two FireWire video cameras over a FireWire network.
But that is more like a fixed duplex link than a call system.

A G4-era Sony Digital8 could be cheaper than an iSight... and the quality is better, if you can deal with the DV codec, which I guess is what gets streamed over FireWire.

MPEG-2 or DivX could be candidates instead of DV if we can wrap codecs.
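
For scale, the standard DV figures (quoted from memory, so treat as approximate):

  DV video: a fixed 25 Mbit/s ≈ 3.1 MB/s, roughly 3.6 MB/s (~13 GB per hour) once audio and subcode are included

FireWire 400 handles that easily, but it would have to be transcoded to something far smaller before going over a home internet connection.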
 
Seconding the earlier point about codec and interface overhead.

There was a time when I was using a USB cam with Skype. In general, the CPU was pegged at no less than 70% and would sometimes spike and hold around 100%. I also had to sit in front of a bare light bulb blasting my face just so the USB cam's picture wasn't dark.

iSight solved all those problems.

So, OP, assuming you do find some way, your bottleneck would then become your USB cam.
 
I've recently been interested in the historical/technical aspect of early video conferencing, i.e. before iSight.

Apple in their wisdom didn't bother with that tech until it was feasible for the average consumer – i.e., iSight was only introduced once their entire Mac line was 600MHz+, had 10/100 Ethernet + AirPort Extreme, and FireWire built in. That's the Apple way.
But there were a ton of consumer webcams from 1999-2001 that used the USB 1 spec – cams designed for the iMac (pre-FireWire), relying on G3s at 200-300MHz, on 10/100 or 56K. Some of these used proprietary software, others worked with emerging standards (Yahoo, AOL?). In that instance, sure, the cam probably couldn't output much more than 240p at 10fps, but it was possible, and plenty of people did it back in the day.

I suppose if just a single one of these old video services still worked, then there is a chance this project would work.
 
You might literally have to create this in a lab then, using period software and workarounds to make it all work. Some of these services and protocols just do not exist anymore, let alone the servers that were controlling this stuff.

Sorry I can't be much help here. I only seriously got into Macs in 2003, around the end of the time period you're looking at.
 
@mectojic I think in principle it is possible with some tinkering and dedication.

Using macam I've been able to bring an old webcam back to life on Tiger - this also installs a driver/component that can be utilised by other software.

Using Apple's free QuickTime Broadcaster, which has numerous transcoding/broadcasting options, I was able to create a live stream with audio. In the screenshot this is cam.mov, which is not a video file but a QuickTime document that points to the live stream.

However, I could only get that to work on the PowerBook doing the broadcasting; if I saved the file on another networked Mac and played it from that machine, it would time out. I think that to work networked/online you need to be running the server edition of Tiger.

QTB also provides URLs for the stream, but I couldn't get those to work either. In principle, though, this method (if installed at both ends) should give you video calling, and without too much CPU tax.

broadcast.png
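
If the stream does get relayed through QTSS/Darwin Streaming Server on a Tiger Server box (I haven't verified that setup myself; the address and file name below are placeholders), the viewing Mac should just be able to open the announced stream directly:

  QuickTime Player > File > Open URL...
  rtsp://10.0.1.10/cam.sdp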
 
An update: I was able to successfully do a video call!

The method was the simplest that I basically knew would work: 2 Macs over iChat through Bonjour:
1) G4 AGP 500MHz, using iChat in 10.4.11 with iSight camera.
2) MacBook Air 2010, iChat under 10.6.8 / 10.7.5.

This is the easiest approach and basically requires no setup, as iChat likes to work with itself. Of course, Bonjour only works on the same network. Still, it proves that video chat in some form is possible – and it's also nice that the test showed iSight video from a 500MHz G4 with AirPort Extreme working quite well in iChat, despite the stated minimum requirement of 600MHz.

So, while video chat is possible, all I can really do at this stage is video call my wife from another room. Also, that requires my wife to use iChat too, or at least the Bonjour-supporting Messages (I think the last version was in 10.12?) – and she won't want to do that. Still, this solution could work nicely as a baby monitor in future, I suppose :D – little PPC baby being monitored by an iSight-equipped iMac G4 :)

Anyway, I think I can do better than that.

I'll keep messing around with Quicktime Broadcaster. Took a while but I got Tiger Server installed, in case it's needed.
 
I've played around with Mac OS X Server a bit over the years, and what you want to do here is definitely iChat using Server. It creates an XMPP/Jabber account for each user on the server, and can be used on the local network for text, audio, and video conferencing. However, as you said, using Bonjour does all of these things anyway! So using Mac OS X Server at this point only adds the ability to connect over the internet, as opposed to the local network. Provided you could set up the proper DNS and DHCP configurations to make your Mac OS X Server act as a "real" server (i.e., connected and accessible to the outside world), you could use iChat as it was originally designed: adding people to your buddy list around the world.

I don't know what else is involved in this, since I never actually set up a wide area network server using Mac OS X Server, but it should be possible. That said, I was actually able to use the built-in VPN server settings to connect from outside the house using one Mac logged into its server account for iChat, and another in-house Mac using its iChat credentials, and chat worked between them.

@Dronecatcher I also played around with QuickTime Broadcaster using Server a bit, and what you did is basically what I was able to achieve as well. I could, however, make the video stream point to the Server, and then see it on any connected Mac in the house by going to the server.url/stream URL. I was also able to make this visible using my Raspberry Pi with Homebridge, so I could see one of my PPC Macs' iSight cameras in HomeKit on my iPhone. :p However, I ran into a few snags using QuickTime Broadcaster; namely, I couldn't figure out how to make distinct streams for each user logged in with their Server account. So, for example, "James" could stream, but only James, and only to a unique video stream URL. I wasn't able to make it stream to server.com/james/stream, essentially. Lots to learn still, but I'm glad you were able to try it out as well!
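
If anyone else wants to poke at the Server side, the command-line tools are handy for checking that the Jabber/iChat service is actually up and that DNS is sane (service and tool names are as I remember them on 10.4/10.5 Server, so double-check on your version):

  sudo changeip -checkhostname     # confirms the server's hostname and DNS records agree
  sudo serveradmin list            # lists available services
  sudo serveradmin status jabber   # iChat (Jabber/XMPP) service state
  sudo serveradmin start jabber

For buddies outside the LAN you'd also need the standard XMPP ports reachable (5222/5223 for clients, 5269 for server-to-server), on top of the audio/video ports iChat itself needs.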
 
I can elaborate on what @Melchieor and I did to get audio/video chat working in iChat AV. We used standard iChat AV with standard Logitech webcams. Most modern Logitech webcams seem to still be well supported on Tiger and Leopard. We were both running Sorbet. I was using a Logitech Brio 4K with a PowerMac G4 Gigabit via a USB 2 PCI card, and OS X automatically put it into a lower-resolution mode. No special drivers. It just worked.

Getting iChat to work was a success with some caveats. We both set up Jabber accounts via https://jabb.im. iChat AV in Leopard was able to connect to it with minimal effort and text chat worked perfectly.

To get audio/video chat working took some extra effort. Following this old blog post, we both set up port forwards for the following (a rough sketch of equivalent router rules follows the list):
  • 5060 - UDP: SIP (Session Initiation Protocol)
  • 5190 - TCP & UDP: AIM/iChat file transfer
  • 5678 - UDP: SNATMAP server
  • 16384-16403 - UDP: RTP/RTCP (Real-Time Transport Protocol)
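Our routers each have their own UI for this, but for anyone scripting it, the equivalent on a Linux-based router would look roughly like this (an iptables sketch only; eth0 as the WAN interface is an assumption, 192.168.20.21 is my Mac's LAN address):

  iptables -t nat -A PREROUTING -i eth0 -p udp --dport 5060 -j DNAT --to-destination 192.168.20.21
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 5190 -j DNAT --to-destination 192.168.20.21
  iptables -t nat -A PREROUTING -i eth0 -p udp --dport 5190 -j DNAT --to-destination 192.168.20.21
  iptables -t nat -A PREROUTING -i eth0 -p udp --dport 5678 -j DNAT --to-destination 192.168.20.21
  iptables -t nat -A PREROUTING -i eth0 -p udp --dport 16384:16403 -j DNAT --to-destination 192.168.20.21
  # plus matching ACCEPT rules in the FORWARD chain if the default policy is DROP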
This, however, wasn't quite enough. When it failed to connect, I did a tcpdump on my router and saw that my local IP (192.168.20.21) was attempting to connect to @Melchieor's local IP (192.168.1.178). Obviously, there's no way that would work, since the connection is traversing the Internet. It should have been going from my local IP to his WAN IP.

To test a workaround, I created separate NAT rules for each of the above ports that rewrote connections destined for his local IP (192.168.1.178) to his WAN IP instead. That worked! We successfully connected and had a video chat with decent (albeit a little grainy) quality.
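
Again in iptables terms (purely illustrative; my router doesn't actually run iptables, and HIS_WAN_IP is a placeholder for the real address), the workaround rule has this shape, repeated for each port from the list above:

  iptables -t nat -A PREROUTING -d 192.168.1.178 -p udp --dport 5060 -j DNAT --to-destination HIS_WAN_IP

In other words, my router quietly redirects the wrongly-addressed traffic to where it should have gone in the first place.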

Once we realized the problem, I started digging more into why iChat wasn't trying to initiate the connection between our WAN IPs. Again, this is where tcpdump comes in. Upon making a call, iChat makes an http request to http://configuration.apple.com/configurations/macosx/ichat/1/snatmap.txt.

That file contains the URL of Apple's SNATMAP server (snatmap://snatmap.apple.com:5678). SNATMAP is Apple's service for determining each party's WAN IP. The problem in this case is that back then, plain HTTP worked fine. Now, that URL gives a 301 redirect to HTTPS that iChat chokes on.

To work around this I set up my own web server and added an entry in my /etc/hosts file that points configuration.apple.com to it. Then I copied the snatmap.txt file onto my web server where iChat expected to find it. When running iChat in debug mode with "/Applications/iChat.app/Contents/MacOS/iChat -errorLogLevel 7" I could see that it no longer errors when querying configuration.apple.com.
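
Concretely, the redirection boils down to one hosts entry plus a copy of the file at the expected path (192.168.20.50 is a made-up address for my web server; substitute your own, and flushing the DNS cache afterwards helps: dscacheutil -flushcache on Leopard, or lookupd -flushcache on Tiger, if memory serves):

  # /etc/hosts on the Mac running iChat
  192.168.20.50   configuration.apple.com

  # file served over plain HTTP at /configurations/macosx/ichat/1/snatmap.txt, containing:
  snatmap://snatmap.apple.com:5678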

After that, we tried a call again while monitoring traffic. In Wireshark we did see UDP queries to snatmap.apple.com:5678 (this didn't happen previously), but iChat was still attempting to connect from my local IP to @Melchieor's.
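
For anyone retracing this, the SNATMAP traffic is easy to spot (the interface name is a placeholder):

  tcpdump -ni en0 'udp port 5678'    # capture filter, also usable in Wireshark's capture options
  udp.port == 5678                   # Wireshark display filter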

Where does this leave us? Short of reverse engineering how Apple's SNATMAP service works and figuring out what response iChat AV is looking for, I'm stuck. While we can successfully video conference, I was hoping to have a write-up on what to do without needing to create fancy NAT rules that forward a mistakenly-used local IP to the correct WAN IP. Needing to do that is limiting.

If anyone with more knowledge in this field is interested, I would love some help in taking this effort further.
 
Would it be possible to have some server set up (either in your each home, or on the internet somewhere) that grabs the data from the iChat configuration URL and then gives it to the iChat computer instead of going directly to Apple?
 
It would have to be a SNATMAP server. Since that is proprietary to Apple, it would have to be reverse engineered and built from scratch. That is beyond my capabilities.
 