So disappointed. Nice speaker, but the HomePod still isn't available in more countries; even the Netherlands officially doesn't have it. And come on, why no Spotify Connect???
Gonna buy a Sonos.
 
They added support for 802.15.4 so that they can bridge Thread-based HomeKit devices to WiFi, not for any other purpose. Its maximum bandwidth is 250 kbps, which at best is barely enough to handle a single stream of mono audio.

Exactly. I have a ConBee II stick that I use for Zigbee, so I can actually watch the firmware update transfer rate. I mean, I guess anyone can watch it in the IKEA app or Hue app, but you don't see the actual file size of the firmware image, so when it takes 3 minutes to upload the firmware file you probably just assume the image is a few MB in size, when it's more like 150 KB.
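Rough back-of-the-envelope math on that (the goodput figure below is just an assumption picked to land in the observed ballpark, not a measurement):

firmware_bytes = 150 * 1024                 # ~150 KB image, as described above
raw_rate_bps = 250_000                      # 802.15.4 raw PHY rate
effective_rate_bps = raw_rate_bps * 0.03    # assume only a few percent goodput after MAC/APS overhead and OTA block requests

print(f"ideal transfer:     {firmware_bytes * 8 / raw_rate_bps:.0f} s")        # ~5 s
print(f"realistic transfer: {firmware_bytes * 8 / effective_rate_bps / 60:.1f} min")  # ~2.7 min, roughly the 3 minutes observed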
 
Can’t make it cordless yet. Give it a year for Gen 2

While I wouldn't put a wireless mini out of the realm of possibility for the future, I also think Apple wants people to buy one for each room, and having a wireless one would defeat that idea since it could be carried from room to room. I think it's Apple being a business and wanting to sell more product, but I also see that in setting up a smart home, it's far easier to have devices assigned to and staying in specific rooms.
 

One of this product's main functions is to act as voice control for HomeKit. As such, it needs to know not just where you are (hence the U1 chip), but where it is ("Siri, turn off the ceiling fan in here"). While having an outdoor version would be nice at some point (for the deck or patio), I am not clear on how someone would use a battery-powered version. If one wants to take it to the park or beach, as has been suggested, this is the wrong product. There would be no "room" for it to characterize in order to adjust the audio, and it would need another device to provide it a network connection, at which point why would one want a smart speaker rather than just using the other device (for intelligence)? The majority of uses for a battery-powered speaker are simply streaming from a phone or tablet, and those are best served by a cheap Bluetooth speaker.
 
Should have been waterproof and wireless.

What is your use case? How much more would you be willing to pay for that functionality (assuming you want all the current HomePod mini functionality; if not, you are not talking about the same product)? How big a battery, and how long does it need to last? When you say "waterproof," do you mean "rain proof" (i.e. you could leave it out on your deck in a rainstorm), or actually "waterproof" (i.e. it could survive being dropped in the pool)?
 
They added support for 802.15.4 so that they can bridge Thread-based HomeKit devices to WiFi, not for any other purpose.

Am I right in thinking that currently there aren't any Thread-based HomeKit devices, e.g. light switches? And excuse my ignorance (new to HomeKit): do all current HomeKit devices need a hub at the moment? I have never been interested in having to have an additional bridge / hub / dongle per connected device, so I never really looked into it...

So I guess I'm asking two things:
1. Can current HomeKit devices connect to this without a hub / bridge?
2. Will future HomeKit devices still need a hub?
 
Yeah, HomeKit can connect to WiFi/BT devices. So let's take an Apple TV. It's going to be sitting on your home network, so it can talk to a WiFi-based device (e.g. LIFX bulbs/lights). Additionally, it has built-in BT, so it can talk to BT devices that are within range (like an August lock).

The only time it needs a hub is when the individual devices being controlled can't communicate over a protocol the Apple TV supports. Take Philips Hue devices, for example. They talk to each other with Zigbee, and the Apple TV has no Zigbee support, so the Hue bridge acts as the middleman.

If some theoretical next-gen version of Hue bulbs natively supported Thread, then you wouldn't need a Hue hub and could connect to them directly. This is already the case with some Echo devices that have built-in Zigbee: you can pair Hue bulbs directly to them without owning a Hue hub.
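A toy way to state that rule (just a sketch of the logic described above; the protocol set and example devices are the ones mentioned in this thread, not an official compatibility list):

HUB_NATIVE_PROTOCOLS = {"wifi", "bluetooth", "thread"}  # what a HomeKit hub can speak directly (Thread only on hardware with an 802.15.4 radio, e.g. HomePod mini)

def needs_bridge(device_protocol: str) -> bool:
    # A device needs its own vendor bridge only if the hub can't talk to it natively.
    return device_protocol.lower() not in HUB_NATIVE_PROTOCOLS

print(needs_bridge("wifi"))       # False - e.g. LIFX bulbs
print(needs_bridge("bluetooth"))  # False - e.g. an August lock
print(needs_bridge("zigbee"))     # True  - e.g. Hue bulbs, hence the Hue bridge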
 
Thanks for the explainer. To check my understanding, then...

Devices that don't natively have WiFi or BT (e.g. Hue bulbs) have to have a bridge, BUT if the device has WiFi or BT it doesn't need a bridge for HomeKit.

And in the future, if a device has Thread built in, it won't need a bridge either?
 
The HomePod is merely a speaker with two apps (HomeKit and Siri).

Still quite boring.
 
I've been saying since Apple discontinued AirPort that Apple was heading towards a future where a WiFi-router-based home network is irrelevant.

Today, Apple implemented two of those necessary pieces. Thread networks devices without the need for a central "home network," and 5G provides speed and bandwidth not only matching that of a WiFi network but exceeding it.

The future of home networks is no home networks. Every device just talks to the others directly, and where feasible and necessary, devices connect to the internet via their own embedded 5G chip.

June 2019:

So explain to me... until 5G becomes cost-effectively cheap for all end users not on contract (pay as you go), and battery-efficient chips let iPhones under heavy use last 48 hours on a single charge with no top-up... where will your internet connection for Siri come from?

I recall, long ago in the early Bluetooth SIG days, a company called RedM created the first 300-foot-plus Bluetooth-connected mesh network, and it did so flawlessly. My, how technology changes; this company probably no longer exists. My point: your vision is flawed and too far in the future, with no real solutions offered yet to fuel that vision. Build it now or keep dreaming for it to happen.
 
It's for turning on and off bits for "smart" devices. It doesn't really need to do much at all. Stuart Cheshire knows what he's doing.

Does he?

I was on the developer team (I wrote most of the code) that first implemented a full working 802.15.4 stack, including beacon mode, based on an ATmega 8-bit MCU and a beta version of the baseband transceiver, back in 2004. And I'm too goddamn familiar with that piece of junk.

As I said, it's slow, clumsy, noisy, and full of bugs in the IEEE spec itself. It's the device that proactively searches for and joins the network, and there is zero control over the network-forming scheme within the spec. The physical layer design put too much focus on DSSS and lacks versatility and some very basic requirements for *real* low-power applications. The ill-designed MAC SAP has a huge vulnerability: a misbehaving device can easily disband the whole PAN via a simple MLME command. And in terms of MLDE, collision / congestion is a plain nightmare. Worst of all, there wasn't enough business motivation to drive a powerful industry alliance, like WiFi's, to amend the bugged lower-layer spec while development was still in the hype stage, though TBH the Lehman Bros. collapse also "helped" a lot in phasing out such amateur technology.

All of this doomed the technology to be niche, and only niche.
 
I suppose if you get bored with it, you can play soccer in the backyard. It's the right size... (or if Siri ticks you off)
 
It’s a home speaker. You leave it at a specific area in your home. You don’t bring this when you go to a pool party or hiking.
I guarantee the next version of the mini (or a portable alternative line) will rest on a MagSafe charger somewhere down the road.
 
Does he?

I was on the developer team (I wrote most of the code) that first implemented a full working 802.15.4 stack, including beacon mode, based on an ATmega 8-bit MCU and a beta version of the baseband transceiver, back in 2004. And I'm too goddamn familiar with that piece of junk.

As I said, it's slow, clumsy, noisy, and full of bugs in the IEEE spec itself. It's the device that proactively searches for and joins the network, and there is zero control over the network-forming scheme within the spec. The physical layer design put too much focus on DSSS and lacks versatility and some very basic requirements for *real* low-power applications. The ill-designed MAC SAP has a huge vulnerability: a misbehaving device can easily disband the whole PAN via a simple MLME command. And in terms of MLDE, collision / congestion is a plain nightmare. Worst of all, there wasn't enough business motivation to drive a powerful industry alliance, like WiFi's, to amend the bugged lower-layer spec while development was still in the hype stage, though TBH the Lehman Bros. collapse also "helped" a lot in phasing out such amateur technology.

All of this doomed the technology to be niche, and only niche.

What device that needs to hit the battery specs of the typical zigbee device isn't running zigbee? You have so many companies independently choosing zigbee - and they are totally free to choose any tech they wanted because they only officially support their hardware with their hub. Take Philips, for example: Hue could use whatever protocol they want, because the protocol is invisible to the user; they only interact with the hub, and how the hub interacts with the devices is irrelevant (to the user).
 
What device that needs to hit the battery specs of the typical zigbee device isn't running zigbee?

Problem is, the so-called "battery spec" isn't viable for the initial goal of this technology. The idea of ZigBee in its pre-2004 releases was an ad-hoc mesh network: you can deploy nodes freely in any location and they will form a mesh network to reach the farthest points. But in real implementations, the current consumption of even the basic RX path is too large to keep the receiver on all the time (ironically, TX current is even lower than RX). The battery will just die in a few days if you try to run a router node on battery, because you need the receiver on all the time so that it can act as a relay node. That's what the spec demanded.
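To put rough numbers on that (the currents below are ballpark assumptions for an early 2.4GHz transceiver plus MCU, not from any particular datasheet):

battery_mah = 2000                 # two AA cells in series: ~3 V, ~2000 mAh
rx_current_ma = 19                 # receiver left on so the node can relay
mcu_current_ma = 8                 # MCU kept awake to service the radio

hours = battery_mah / (rx_current_ma + mcu_current_ma)
print(f"always-on router node: ~{hours:.0f} h = ~{hours / 24:.1f} days")   # a few days, as stated above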

This technology lacks any network-wide approach to time syncing. The so-called "beacon mode" in 802.15.4 is nothing but a self-contradictory joke. Beacons can only be sent by the coordinator of a network and can only be received by first-tier nodes, and when a node is receiving beacons it's forced to act as an end node. In other words, the spec itself demands that you build a star network, and only a star network, if you want to save power through TDMA approaches.
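For reference, the power saving that beacon mode does offer looks roughly like this (2.4GHz PHY timing; a sketch of the superframe math, not a MAC implementation):

BASE_SUPERFRAME_MS = 960 * 0.016   # aBaseSuperframeDuration: 960 symbols at 16 us/symbol

def duty_cycle(beacon_order: int, superframe_order: int) -> float:
    # Fraction of time a node must stay awake, per the beacon-enabled superframe structure.
    assert 0 <= superframe_order <= beacon_order <= 14
    return 2 ** (superframe_order - beacon_order)

bo, so = 10, 4
print(f"beacon interval: {BASE_SUPERFRAME_MS * 2 ** bo / 1000:.1f} s")  # ~15.7 s
print(f"active period:   {BASE_SUPERFRAME_MS * 2 ** so:.0f} ms")        # ~246 ms
print(f"duty cycle:      {duty_cycle(bo, so):.2%}")                     # ~1.6%, but only for first-tier (star) nodes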

The ZigBee Alliance forum discussed this several times. I know because I was there, at all the Open House events, and I proposed several possible workarounds, including an out-of-band synchronizer signal and a wake-on-RX design tweak in the PHY layer, but the final verdict was: find a way yourself.

So in the end we have what we see today: a node can either passively submit trigger events like a remote control, or work on a very long duty cycle like a sensor, or it must be plugged into a wall outlet all the time.


and they are totally free to choose any tech they wanted because they only officially support their hardware with their hub.

Again, this is not what the ZigBee Alliance wanted. We have ZDO, end nodes, service discovery and profiles so that we can have interoperability between different hubs and devices, but now the vendors just make it proprietary. They've discarded everything that made this technology distinct, and it would literally make no difference if they chose some other proprietary digital modulation like the TI CC1000 series.

In fact it's probably better if they do it that way, because 802.15.4 is prone to 2.4GHz interference, as there is no frequency-hopping scheme at all within the spec. The DSSS baseband isn't really that helpful in the real world, especially in environments where the 2.4GHz band is congested by WiFi or cordless phones. To make it even worse, many SoC ZigBee modules just fail to achieve the required RX sensitivity, so we need a ~12 dB noise margin in real-world deployments, and that's a challenge for any distance longer than 60 feet, not counting obstacles like walls and doors. This is a plain nightmare for indoor environments, because the human body is a huge obstacle for a 2.4GHz signal. That's insignificant for a 100mW WiFi signal, but for a 1mW 802.15.4 signal it's a completely different story.
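A toy link budget for that 60-foot case (free-space model plus one lumped guess for indoor losses; every number is an assumption for illustration):

from math import log10

def fspl_db(distance_m: float, freq_mhz: float = 2440.0) -> float:
    # Free-space path loss in dB (distance converted to km for the standard formula).
    return 20 * log10(distance_m / 1000) + 20 * log10(freq_mhz) + 32.44

tx_dbm = 0.0             # 1 mW 802.15.4 transmitter
sensitivity_dbm = -85.0  # spec-required receiver sensitivity
margin_db = 12.0         # noise margin wanted in a congested 2.4GHz band
indoor_loss_db = 15.0    # assumed lump sum for walls, furniture, a body in the path

rx_dbm = tx_dbm - fspl_db(60 * 0.3048) - indoor_loss_db
print(f"received: {rx_dbm:.1f} dBm, headroom vs. 12 dB margin: {rx_dbm - (sensitivity_dbm + margin_db):.1f} dB")
# Negative headroom here means the wanted margin is already gone at this range.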

You have so many companies independently choosing zigbee

That's because of the marketing strategy we were taking at the time: make ZigBee a cable replacement for RS-232. We knew ZigBee wouldn't sell if we just shipped it with an SDK, so we made it compatible with UART AT commands. Just about all the chip vendors and SIs were selling their kits this way at the time. This greatly reduced the development cost for device vendors, which could offset the higher cost of the transceiver units.

Also, don't forget that Bluetooth LE was only officially released in 2011, with compliance certification in 2012. Before that there was no other mid-range, low-spec wireless communication technology on the 2.4GHz ISM band for commercial products. All the other competitors, including Bluetooth 3.0, were killed off in the 2008 Lehman Bros. collapse.
 
You're suggesting forced use cases, then saying the tech is bad because it doesn't meet a use case that nobody using the tech cares about. Take the first paragraph for example - you decided the battery spec "isn't viable" based on your own determination of how the network should be set up. Anyone actually using the tech is perfectly fine with having a distinction between mains powered devices and battery powered devices, where only mains powered devices are used as router nodes.

Nobody cares about allowing a battery-powered device to act as a router node, because no matter how battery-efficient a protocol you pick, there's no way to create a router node that uses so little power it can run off nothing more than a capacitor and the kinetic energy created by clicking the button itself.

If your argument is "it's not perfect, therefore it's trash," then so be it. But I don't think you can really make any case for it not doing what people want it to do. Try to ignore the overly technical details for a second and come up with a couple of real-world examples of things it currently can't do because of poor design, but that, if it could, would present a real-world benefit to Philips Hue or IKEA Trådfri?
 
you decided the battery spec "isn't viable" based on your own determination of how the network should be set up. Anyone actually using the tech is perfectly fine with having a distinction between mains powered devices and battery powered devices, where only mains powered devices are used as router nodes.

It's not "I" who decided how the network should be set up; that was the initial goal of this technology. This standard was designed to do what I described as its sole purpose, and it failed.

Without the ad-hoc mesh feature, 802.15.4/ZigBee is nothing but a low-speed RF transceiver. Whatever it can do, Bluetooth/LE, UWB, Wi-SUN (a "modern" revision of 802.15.4 dedicated to sub-GHz channels and energy networks), as well as all the other proprietary ISM-band RF technologies from the likes of TI, Microchip and Atmel, can do faster, cheaper, over longer distances, more reliably, and with even fewer resources.

We never intended it to be a low-end UART-replacement solution, but that is how the market accepted it.


Nobody cares about allowing a battery-powered device to act as a router node, because no matter how battery-efficient a protocol you pick, there's no way to create a router node that uses so little power it can run off nothing more than a capacitor and the kinetic energy created by clicking the button itself.

The US DoD cares. They were the initial sponsor of the wireless sensor network research projects, and the goal of that research was a versatile mesh network that can automatically form itself and find redundant routes all on its own. They were designed this way so that the network stays functional even if some nodes are destroyed, and they were designed to run on batteries so that they can be rapidly deployed in places traditional wired/wireless high-speed networks can't reach. Something like 80% of the code in this protocol stack handles these issues (I know because I wrote it), and now you're saying nobody cares about it. That's why it's a failure.

And no, it's not possible to run 802.15.4 on kinetic energy. The required RF transmission power is 1mW, and the phase-locked loop has to stay active during the whole network-joining procedure, which lasts from a few hundred milliseconds to tens of seconds, depending on the coordinator's settings. If someone told you that an 802.15.4 button can be powered by the button click itself, he is lying. That thing may actually be powered by kinetic energy, but it's definitely not 802.15.4.
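For scale, with assumed numbers only (nothing here is measured), the energy gap being claimed looks like this:

voltage_v = 3.0
radio_ma = 25.0        # assumed current with PLL and radio active
join_s = 0.5           # optimistic low end of the joining-procedure duration above
harvest_mj = 0.3       # assumed energy yield of one kinetic button press

join_energy_mj = voltage_v * radio_ma * join_s   # V * mA * s = mJ
print(f"full 802.15.4 join: ~{join_energy_mj:.0f} mJ vs ~{harvest_mj} mJ from one button press")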


If your argument is "it's not perfect, therefore it's trash," then so be it. But I don't think you can really make any case for it not doing what people want it to do. Try to ignore the overly technical details for a second and come up with a couple of real-world examples of things it currently can't do because of poor design, but that, if it could, would present a real-world benefit to Philips Hue or IKEA Trådfri?

There is literally ZERO technical benefit in these products other than PROLONGING THEIR OUTDATED CODEBASE by adopting a castrated subset of AT-command APIs. R&D HQ is happy because they don't need to learn a new technology. The HR department is happy because they can further cut R&D personnel. And the PM is even happier because they can get an unbelievable bargain from the chip vendors, since no one else wants those chips.

Locking customers into their own ecosystem is only a side effect, because at the management level they had zero idea about "compatibility" to begin with. They don't have the slightest idea what it even is.

You're not an "insider" of this industry. I am. And I'm telling you: consumers deserve better RF technology for the price they pay.
 
And no, it's not possible to run 802.15.4 on kinetic energy. The required RF transmission power is 1mW, and the phase-locked loop has to stay active during the whole network-joining procedure, which lasts from a few hundred milliseconds to tens of seconds, depending on the coordinator's settings. If someone told you that an 802.15.4 button can be powered by the button click itself, he is lying. That thing may actually be powered by kinetic energy, but it's definitely not 802.15.4.



You're not an "insider" of this industry. I am. And I'm telling you: consumers deserve better RF technology for the price they pay.

Nobody needs to tell me it can be powered by kinetic energy. I can see it right in front of me with the Hue Tap switch I have sitting on my desk. Additionally, I have two of these (Click for Philips Hue - RunLessWire).

I don't think being an insider helps you here. An insider of any project can list a litany of things that were left on the table or 'cut for time' while lamenting that the final product wasn't all it could have been. I don't think there's anyone who worked on any codebase where the final result was 100% of the envisioned result; the difference is just how invested people were in the specific things that got cut. I think it stops you from actually assessing the practical implementations. I have devices that don't even need a battery; I can't find any mainstream non-Zigbee device that can do that.
 
It's probably just a software and firmware update to add support?
Right? OG HomePod, anyone??

Can this work with Spotify, or does it only work with Apple Music?
There's an Android app (hify, 10-minute trial) that needs to be running on a spare Android phone lying around your house, connected to the same WiFi; it gives you Spotify Connect for the HomePods.

Never tried it, but surely worth a shot. Check for more info here!

Does he?

I was on the developer team (I wrote most of the code) that first implemented a full working 802.15.4 stack, including beacon mode, based on an ATmega 8-bit MCU and a beta version of the baseband transceiver, back in 2004. And I'm too goddamn familiar with that piece of junk.

As I said, it's slow, clumsy, noisy, and full of bugs in the IEEE spec itself. It's the device that proactively searches for and joins the network, and there is zero control over the network-forming scheme within the spec. The physical layer design put too much focus on DSSS and lacks versatility and some very basic requirements for *real* low-power applications. The ill-designed MAC SAP has a huge vulnerability: a misbehaving device can easily disband the whole PAN via a simple MLME command. And in terms of MLDE, collision / congestion is a plain nightmare. Worst of all, there wasn't enough business motivation to drive a powerful industry alliance, like WiFi's, to amend the bugged lower-layer spec while development was still in the hype stage, though TBH the Lehman Bros. collapse also "helped" a lot in phasing out such amateur technology.

All of this doomed the technology to be niche, and only niche.

So Thread is merely an IP-based protocol and could totally be supported by the hardware inside a HomePod (OG), and Apple just doesn't let that happen (yet)?
 