So disappointed. Nice speaker, but the HomePod still isn't available in more countries; even in the Netherlands it's not officially available. And come on, why no Spotify Connect???
Gonna buy a Sonos.
They added support for 802.15.4 so that they can bridge Thread-based HomeKit devices to WiFi, not for any other purpose. Its maximum bandwidth is 250 kbps, barely enough at maximum to handle a single stream of mono audio.
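The 250 kbps figure is easy to sanity-check against audio bitrates. A rough sketch, using assumed figures (CD-quality mono PCM and a typical 64 kbps AAC mono stream, before any protocol overhead):

```python
# Back-of-the-envelope check of the 250 kbps claim. The audio bitrates are
# illustrative assumptions, not anything the HomePod actually streams.
IEEE_802_15_4_MAX_BPS = 250_000          # 2.4 GHz PHY maximum raw rate

pcm_mono_bps = 44_100 * 16 * 1           # sample rate * bit depth * channels
aac_mono_bps = 64_000                    # common compressed mono bitrate

print(pcm_mono_bps / IEEE_802_15_4_MAX_BPS)  # ~2.8x over capacity, uncompressed
print(aac_mono_bps / IEEE_802_15_4_MAX_BPS)  # ~a quarter of raw capacity
```

So even a compressed mono stream would eat a quarter of the raw channel before MAC overhead; uncompressed mono doesn't fit at all.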
Can’t make it cordless yet. Give it a year for Gen 2
Should have been waterproof and wireless.
It’s a home speaker. You leave it at a specific area in your home. You don’t bring this when you go to a pool party or hiking.
I think it's Apple being a business and wanting to sell more product, but I also see that in setting up a smart home, it's far easier to have devices set to and staying in specific rooms.
They added support for 802.15.4 so that they can bridge Thread-based HomeKit devices to WiFi, not for any other purpose.
Am I right in thinking that currently there aren't any Thread-based HomeKit devices, e.g. light switches? And excuse my ignorance (new to HomeKit), but do all current HomeKit devices need a hub at the moment? I have never been interested in having to have an additional bridge / hub / dongle per connected device, so I never really looked into it...
So I guess I'm asking two things:
1. Can current HomeKit devices connect to this without a hub / bridge?
2. Will future HomeKit devices still need a hub?
Yeah, HomeKit can connect to WiFi/BT devices. So let's take an Apple TV. It's going to be sitting on your home network, so it can talk to a WiFi-based device (e.g. LIFX bulbs/lights). Additionally, it has built-in BT, so it can talk to BT devices that are within range (like an August lock).
The only time it needs a hub is when the individual devices being controlled aren't able to communicate on a protocol that the Apple TV supports. Take Philips Hue devices, for example. They talk to each other with zigbee, and the Apple TV has no zigbee support, so the Hue bridge acts as the middleman.
If some theoretical next-gen version of Hue bulbs natively supported Thread, then you wouldn't need a Hue hub and could connect to them directly. This is already the case with some Echo devices, which have built-in zigbee: you can pair Hue bulbs to them directly without owning a Hue hub.
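The rule described above boils down to one question: does the accessory speak a protocol the hub hardware has a radio for? A toy sketch of that decision (the protocol sets and device names are illustrative assumptions, not Apple's actual logic):

```python
# A hub/bridge is only needed when the accessory speaks a protocol the
# HomeKit hub hardware can't. Protocol list assumed for a HomePod mini-like hub.
HUB_PROTOCOLS = {"wifi", "bluetooth", "thread"}

def needs_bridge(device_protocol: str) -> bool:
    """True when the accessory needs a middleman like the Hue bridge."""
    return device_protocol not in HUB_PROTOCOLS

print(needs_bridge("wifi"))    # LIFX-style bulb -> False, talks directly
print(needs_bridge("zigbee"))  # current Hue bulb -> True, needs the Hue bridge
print(needs_bridge("thread"))  # hypothetical Thread Hue bulb -> False
```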
I’ve been saying since Apple discontinued AirPort that Apple was heading towards a future where a WiFi-router-based home network was irrelevant.
Today, Apple implemented two of those necessary pieces: Thread networks devices together without the need for a central “home network,” and 5G provides speed and bandwidth not only matching that of a WiFi network but exceeding it.
The future of home networks is no home networks. Every device just talks to the others directly, and where feasible and necessary, devices connect to the internet via their own embedded 5G chip.
June 2019:
It's for turning on and off bits for "smart" devices. It doesn't really need to do much at all. Stuart Cheshire knows what he's doing.
I guarantee the next version of the mini (or a portable alternative line) will rest on a MagSafe charger somewhere down the road.
I guess it is 👍🏼
I hope this is the same USB-C power adapter as they sell for the iPhones!
Does he?
I was on the developer team (most of the code was written by me) that first implemented a fully working 802.15.4 stack, including beacon mode, based on an ATmega 8-bit MCU and a beta version of a baseband transceiver, back in 2004. And I'm all too god damn familiar with that piece of junk.
As I said, it's slow, clumsy, noisy, and full of bugs in the IEEE spec itself. It's the device that proactively searches for and joins the network, and there is zero control over the network-forming scheme within the spec. The physical layer design put too much focus on DSSS and lacks versatility and some very basic requirements for *real* low-power applications. The ill-designed MAC SAP has a huge vulnerability: a misbehaving device can easily disband the whole PAN via a simple MLME command. And in terms of MLDE, collision / congestion is a plain nightmare. Worst of all, there wasn't enough business motivation to drive a powerful industry alliance, like WiFi's, to amend the buggy lower-layer spec while development was still in the hype stage, though TBH the Lehman Bros. incident also "helped" a lot in phasing out such amateur technology.
All of this sealed the fate of this technology: niche, and only ever niche.
What device that needs to hit the battery specs of the typical zigbee device isn't running zigbee?
You have so many companies independently choosing zigbee, and they are totally free to choose any tech they want because they only officially support their hardware with their hub.
You're suggesting forced use cases, then saying the tech is bad because it doesn't meet a use case that nobody using the tech cares about. Take the first paragraph, for example: you decided the battery spec "isn't viable" based on your own determination of how the network should be set up. Anyone actually using the tech is perfectly fine with having a distinction between mains-powered devices and battery-powered devices, where only mains-powered devices are used as router nodes.
Problem is, the so-called "battery spec" isn't viable for the initial goal of this technology. The idea of ZigBee in the pre-2004 release was an ad-hoc mesh network: you could deploy nodes freely to any location and they would form a mesh network to reach the farthest points. But in real implementations, the current consumption of even the most basic RX is too large to keep the receiver on all the time (ironically, TX current is even lower than RX). The battery will just die in a few days if you try to run a router node on battery, because you need the receiver on all the time so that it can act as a relay node. That's what the spec demanded.
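The "dies in a few days" claim is easy to sanity-check from receiver current alone. A rough sketch, using assumed figures (~19 mA RX current, typical for early 802.15.4 transceivers, and a ~2500 mAh pair of AA cells):

```python
# Why an always-on router node dies in days: the receiver must stay on
# continuously so the node can relay traffic. Both figures below are
# order-of-magnitude assumptions, not measurements of any specific chip.
rx_current_ma = 19.0      # continuous receive current
battery_mah = 2500.0      # pair of AA cells

hours = battery_mah / rx_current_ma
print(round(hours / 24, 1))   # ~5.5 days for an always-listening node
```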
This technology lacks any network-wide approach to time syncing. The so-called "beacon mode" in 802.15.4 is nothing but a self-contradictory joke. Beacons can only be sent by the coordinator of a network and can only be received by the first-tier nodes. And when a node is receiving beacons, it's forced to act as an end node. In other words, the spec itself demands that you build a star network, and only a star network, if you want to preserve power via TDMA approaches.
The ZigBee Alliance forum discussed this several times. I know because I was there, at all the Open House events, and I proposed several possible workarounds, including an out-of-band synchronizer signal and a wake-on-RX design tweak in the PHY layer, but the final verdict was: find a way yourself.
So in the end we have what we see today: a node can either passively submit trigger events like a remote control, or work on a very long duty cycle like a sensor, or it must be plugged into a wall outlet all the time.
Again, this is not what the ZigBee Alliance wanted. We have ZDO, end nodes, service discovery, and profiles so that we can have interoperability between different hubs and devices, but now the vendors just make it proprietary. They discarded everything that gave this technology its identity, and it would literally make no difference if they chose some other proprietary digital modulation like the TI CC1000 series.
In fact, it's probably better if they make it that way, because 802.15.4 is prone to 2.4GHz interference, as there is no frequency-hopping scheme in the spec. The DSSS baseband isn't really all that helpful in the real world, especially in environments where the 2.4GHz band is congested by WiFi or cordless PSTN phones. To make it even worse, many SoC ZigBee modules simply failed to achieve the required RX sensitivity. So we need a ~12 dB noise margin in real-world deployments, and that is a challenge for any distance longer than 60 feet, not counting obstacles like walls and doors. This is a plain nightmare for indoor environments, because the human body is a huge obstacle for a 2.4GHz signal. It's insignificant for a 100mW WiFi signal, but for a 1mW 802.15.4 signal it's a completely different story.
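The 60-foot figure falls out of a simple free-space link budget. A sketch using textbook assumptions (0 dBm / 1 mW TX, the -85 dBm 802.15.4 spec sensitivity, isotropic antennas, and the ~12 dB real-world noise margin mentioned above):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_dbm = 0.0                      # 1 mW transmitter
sensitivity_dbm = -85.0           # 802.15.4 2.4 GHz PHY requirement
loss = fspl_db(18.3, 2.4e9)       # 60 ft is about 18.3 m

fade_margin = tx_dbm - loss - sensitivity_dbm
print(round(loss, 1))             # ~65 dB of free-space loss alone
print(round(fade_margin, 1))      # ~20 dB left; a 12 dB noise margin plus a
                                  # couple of walls or a body eats most of it
```

Note this is the best case: real indoor propagation (walls, multipath, people) is far worse than free space.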
That's because of the marketing strategy we were pursuing at the time: make ZigBee a cable replacement for RS232. We knew that ZigBee wouldn't sell if we just shipped it with an SDK, so we made it compatible with UART AT commands. Just about all the chip vendors and SIs were selling their kits this way at the time. This greatly reduced the development cost for device vendors, which could then cover the higher cost of the transceiver units.
Also, don't forget that Bluetooth LE wasn't officially released until 2011, with compliance certification in 2012. Before that, there was no other mid-range, low-spec wireless communication technology based on the 2.4GHz ISM band for commercial products. All the other competitors, including Bluetooth 3.0, were killed off in the 2008 Lehman Bros. collapse.
you decided the battery spec "isn't viable" based on your own determination of how the network should be set up. Anyone actually using the tech is perfectly fine with having a distinction between mains powered devices and battery powered devices, where only mains powered devices are used as router nodes.
Nobody cares about allowing a battery-powered device to act as a router node, because no matter how battery-efficient a protocol you pick, there's no way to create a router node that uses so little power it can run off nothing more than a capacitor and the kinetic energy created by clicking the button itself.
If your argument is "it's not perfect, therefore it's trash," then so be it. But I don't think you can really make a case for it not doing what people want it to do. Ignore the overly technical details for a second and come up with a couple of real-world examples of things it currently can't do because of poor design, but that, if it could do them, would present a real-world benefit to Philips Hue or Ikea Tradfri.
And no, it's not possible to run 802.15.4 on kinetic energy. The required RF transmission power is 1mW, and the phase-locked loop has to stay active during the whole network-joining procedure, which lasts from a few hundred milliseconds to tens of seconds, depending on the settings of the coordinator. If someone told you that an 802.15.4 button can be powered by clicking it, he is lying. That thing may actually be powered by kinetic energy, but it's definitely not 802.15.4.
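The energy arithmetic behind that claim is worth spelling out. A rough sketch with assumed, order-of-magnitude figures (an active 802.15.4 radio plus PLL drawing ~20 mA at 3 V, the optimistic ~0.5 s join time from above, and a kinetic click harvester yielding roughly 0.3 mJ per press):

```python
# Energy needed for one network join vs. energy from one button click.
# All three figures are illustrative assumptions, not measured values.
radio_power_w = 0.020 * 3.0       # ~60 mW while radio and PLL are active
join_time_s = 0.5                 # optimistic lower bound on join duration
join_energy_j = radio_power_w * join_time_s

click_energy_j = 0.3e-3           # typical kinetic harvester per button press

print(join_energy_j)                   # 0.03 J = 30 mJ per join attempt
print(join_energy_j / click_energy_j)  # ~100 clicks' worth of energy per join
```

Under these assumptions, a single join costs two orders of magnitude more energy than a click yields, which is the point being made.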
You're not an "insider" of this industry. I am. And I'm telling you: consumers deserve better RF technology for the price they pay.
Right? OG HomePod anyone??
It’s probably just a software and firmware update to add support?
There's an Android app (hify, trial for 10 mins) that needs to be running on a spare Android phone lying around your house, connected to the same WiFi; it gives you Spotify Connect for the HomePods.
Can this work with Spotify, or does it only work with Apple Music?