The Voltage is not calculated,
So it is not some mathematically modellable function of the current and recent input signal?

it is simultaneously present and relational to other factors of the circuit.
What on earth does "simultaneously present and relational to other factors" mean? The output as it is updated on a computer screen is "simultaneously present" in various circuits leading to the screen and is "relational to" multiple activities moving all the way back to how you wiggle your mouse.

Meanwhile, the AGC voltage used at the input of whatever amplifier circuit is not merely a reflection of the volume at the current instant. It will be delayed in some way. If you want another example, AM demodulation uses capacitance to output only the envelope change representing AF wave but smooth out the RF wave. At any instant the capacitor is storing information, whether you like it or not.
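To make the capacitor-stores-information point concrete, here's a toy sketch (in JavaScript, since that runs anywhere) of an ideal-diode envelope detector. The charging model and decay constant are invented for illustration, not taken from any real receiver:

```javascript
// Toy envelope detector: a diode charges a capacitor quickly,
// and the capacitor discharges slowly through a resistor. The
// output at any instant depends on past peaks; the capacitor is
// holding a decaying record of recent history.
// decayPerSample is an arbitrary illustrative value.
function envelopeDetector(samples, decayPerSample) {
  let vCap = 0;
  const out = [];
  for (const v of samples) {
    const rectified = Math.max(v, 0);  // ideal diode
    if (rectified > vCap) {
      vCap = rectified;                // fast charge up to the peak
    } else {
      vCap *= decayPerSample;          // slow RC discharge
    }
    out.push(vCap);
  }
  return out;
}

// A carrier burst followed by silence: the output stays nonzero
// well after the input stops, because the capacitor "remembers".
const rf = [];
for (let i = 0; i < 200; i++) {
  rf.push(i < 100 ? Math.sin(i * 0.9) : 0); // carrier, then nothing
}
const env = envelopeDetector(rf, 0.98);
```

During the silent half, `env` decays gradually instead of dropping to zero, which is exactly the "storing information whether you like it or not" behaviour.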

Take a magnet to that bad boy and watch your needle be permanently ruined. Is it calculating data now? :rolleyes:
Needle.. needle.. what do you think AGC actually is? It's true that the AGC output can be used to drive a meter as well, but that's hardly its central purpose. Also, take a hammer to a computer and your computer is permanently ruined. I'm not actually sure what you're talking about any more.

I started off arguing a text book and omni-dictionary definition of computers created by many people with fancy letters after their name and many more people with fancy letters after their name backed me up.
There's your first mistake. The number of fancy postnominals I or they have is irrelevant because arguments by authority are irrelevant.

I and many other people couldn't care about a layman's definition
Then why are you debating it? We all pretty much agree on various science and engineering definitions of the term. My argument has always been (1) English is defined by the whole body of people who speak it, not by specialists; (2) "computer" has a different definition to the general public than it has to specialists; thus (3) a mousetrap is definitely not a computer, and an iPhone is not a computer despite sharing some properties of one.

The layperson's definition of a computer is up to the general public and in the spirit of KnightWRX "That them computers are everywhere"
The spirit of KnightWRX's post was that the public are "ignorant" (was that the word used?) idiots, easily amazed and to be patronised.

All of your posts have been comments which show your lack of knowledge about science.
This has been a discussion about linguistics - the few things I've said about RF, computer science and old hardware haven't been a matter of huge contention afaict. If you're arguing that I lack knowledge in linguistic "science" then I would love to be corrected. Indicate what I have said that you think is wrong, and prove it to be wrong.

your 25 year old radio is not a computer.
By layman's standards, I'd absolutely agree. I want to understand what definition you are using and why my 25 year old radio does not comply.

[circuit diagram]
Is this missing some accompanying text? It's an audio filter using a standard 741 op-amp with instantaneous positive feedback varied with a pot (i.e. volume control). Negative feedback is used for the actual filtering. There's no historical data in any real sense stored by this circuit when active.
 
 
Your statement was, "With Dragon Dictate, I can speak and transcribe +130/minute." You were alluding to the iPhone being powerful because it can be used for continuous dictation, even though it cannot except as a peripheral. You did not assert, "With Dragon Dictate I can see a list of suggested words and perform the basic tasks of a word processor."

To tackle your new, different argument: yes, pocket thesauruses have existed for ages, and it's no stretch of the imagination to contemplate a pocket homophone lookup (of course, the iPhone isn't even doing the lookup: it's just acting as a terminal for a list of words from the server). The '80s were replete with basic "word processors" which were essentially advanced typewriters, differentiated from the burgeoning home computer market. The public identified neither as a computer.
Nothing new really, as the dictation program in question happens to be one of hundreds of thousands of applications, all of which are run on a computer operating system, called iOS.

The only thing you've managed to tackle here are your own contradictions.

If you want to look at it that way round: a sip does not quench thirst.

For you, a glass half-full constitutes a 'sip.'
 
cmaier said:
Radio makes no decisions based on state. (doesn't even have state by any informatics definition - a capacitor is acting as a filter, not a controllable delay element)
Of course you're right that the typical capacitor (e.g. in a tuned LC) in a radio is modelled as a filter and not as a device storing a particular value using voltage. Indeed, it's impossible to fully represent the state of any analogue system at any particular instant - i.e. describe the system now in such a way that its future behaviour with any given input can be predicted - using the methods applicable to a synchronous digital system, e.g. a finite-state diagram.

As for decisions, my radio makes one obvious binary decision: whether to squelch. Whether you consider this as a decision based on state depends on how you're modelling the system.
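For what it's worth, a squelch with hysteresis is the cleanest example of a decision that genuinely depends on prior state: the gate opens above one threshold and stays open until the signal falls below a lower one, so the same input can produce either output depending on history. The thresholds below are made up for illustration:

```javascript
// A squelch gate with hysteresis: audio is muted until signal
// strength exceeds an open threshold, and stays unmuted until it
// falls below a lower close threshold. With two thresholds, the
// decision depends on the previous decision, i.e. on state.
// Threshold values here are arbitrary.
function makeSquelch(openAbove, closeBelow) {
  let open = false; // the one bit of state
  return function (signalStrength) {
    if (!open && signalStrength > openAbove) open = true;
    else if (open && signalStrength < closeBelow) open = false;
    return open; // true = pass audio, false = mute
  };
}

const squelch = makeSquelch(0.6, 0.4);
const strengths = [0.2, 0.7, 0.5, 0.5, 0.3, 0.5];
const gate = strengths.map(s => squelch(s));
// A strength of 0.5 is passed or muted depending on what came
// before it: gate is [false, true, true, true, false, false]
```

Whether an analogue squelch circuit "has state" in the informatics sense is exactly the modelling question under debate; the sketch just shows what the behaviour looks like when you do choose to model it that way.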

Nothing new really, as the dictation program in question happens to be one of hundreds of thousands of applications,
Does anyone take the raw number of apps seriously as an indicator of the quality of the iPhone/iPad? The standard issue response to this is, "And how many fart apps?" Cool stuff like Dragon and the AR.Drone confirms to me that the iPhone is an excellent accessory, with an ecosystem capable of increasing availability to existing technology (there's very rarely anything new for the iPhone but often tech which the masses seemed not to know existed before now - that's what Apple's so good at delivering).

all of which are run on a computer operating system, called iOS.
What is it with Apple using Cisco names? I think I'll stick to "iPhone OS" until I can say "IOS" without someone wondering whether I'm talking about IOS or iOS. And I can only do that because the Cisco iPhone wasn't significantly marketed.

Both more than fit-for-purpose as embedded system operating environments go, of course.

For you, a glass half-full constitutes a 'sip.'
If you will recall, this is about a device being "semi-open". This doesn't actually mean "50%" in any quantifiable sense - the same way that a semi-automatic rifle is "sorta automatic" and a semi-formal dance is "sorta formal" but we don't objectively measure one half of something. I used the glass-half-full analogy to illustrate that something with a degree of some property does not have either zero or all of that property. I'm sorry that it wasn't clear enough for you.

To me, trying to program an iPhone is closer to a sip than a gulp of openness. Many people have the same complaint, and the argument has been hashed and rehashed across the web thousands of times.
 
What is Apple trying to show? The iPhone 4 has the same bad signal issues other phones have? This is not a typical Apple move -- they usually showcase the positives of their own products.
-Aaron-

But they are! They are positive their phones have the same antennae problems others do!
 
How did you manage to conclude that? ...Xserve is designed with general purpose local and network access in mind.
Just like an iPhone. There are dozens of iPhone apps that allow you to connect to the internet and either download data, serve network data, or both, using the same Unix BSD socket interface as any Linux server.
You can connect any USB keyboard.
So exactly why does the Xserve's ability to connect a USB keyboard make it a computer, but the iPhone 4's or iPad's ability to connect to almost any Bluetooth keyboard not?
There's an onboard GT 120.
And the iPhone has an onboard PowerVR SGX far more powerful than the 3D GFX on my PowerMac G3, and I can get a video cable for my iPhone from Apple. Is a PowerMac G3 running OS X 10.4 a computer?
Then there's Firewire, USB host, two gigabit Ethernet, and (gods bless them) even RS232.
Well OK, the i4 doesn't have Firewire. Neither does one of the MacBook models (is it still a computer?). But several vendors sell peripherals that use the iPhone's serial port. And for hotels with Ethernet and no Wi-Fi, I can carry an Airport Express and still connect to the net with my iPhone.
 
A computer cannot be analogue; it can only be digital.
A very famous engineer once said: "there is really no such thing as digital". At the very bottom level, all those digital gates are really built out of analog transistors and statistical semiconductor junctions, and an engineer will never design really high performance or low power computers unless they remember that.
Digital does not mean it has a processor.
Absolutely true. The digital stuff usually has to at least be programmable and Turing complete (minus the infinite memory) to be called a computer or processor. JavaScript is far more than just Turing complete, and I can both write and run programs in JavaScript on my iPhone.
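To make "programmable and Turing complete (minus the infinite memory)" concrete, here's a minimal, purely illustrative Turing machine runner in JavaScript, with a binary-increment rule table as its "software". Both the runner and the program are sketches of the textbook construction, not anything from a real system:

```javascript
// A minimal Turing machine: a rule table maps (state, symbol) to
// (symbol to write, head move, next state). "_" is the blank.
function runTuringMachine(rules, tape, state, head) {
  tape = tape.slice(); // don't mutate the caller's tape
  while (state !== "halt") {
    const symbol = tape[head] ?? "_";
    const [write, move, next] = rules[state + "," + symbol];
    tape[head] = write;
    head += move === "R" ? 1 : -1;
    if (head < 0) { tape.unshift("_"); head = 0; }
    state = next;
  }
  return tape.join("").replace(/_/g, "");
}

// "Program": increment a binary number. Walk right to the end of
// the digits, then carry leftward.
const increment = {
  "right,0": ["0", "R", "right"],
  "right,1": ["1", "R", "right"],
  "right,_": ["_", "L", "carry"],
  "carry,1": ["0", "L", "carry"],
  "carry,0": ["1", "L", "done"],
  "carry,_": ["1", "L", "done"],
  "done,0":  ["0", "L", "done"],
  "done,1":  ["1", "L", "done"],
  "done,_":  ["_", "R", "halt"],
};

runTuringMachine(increment, ["1", "0", "1", "1"], "right", 0); // "1100"
```

Swap in a different rule table and the same dozen-line runner computes something else entirely; that separation of fixed machinery from interchangeable program is the property being argued about.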
 
A very famous engineer once said: "there is really no such thing as digital". At the very bottom level, all those digital gates are really built out of analog transistors and statistical semiconductor junctions, and an engineer will never design really high performance or low power computers unless they remember that.

Absolutely true. The digital stuff usually has to at least be programmable and Turing complete (minus the infinite memory) to be called a computer or processor. JavaScript is far more than just Turing complete, and I can both write and run programs in JavaScript on my iPhone.

Computers are classified as analog or digital based on whether they internally represent values in quantized form. It is true that the underlying devices in a digital electronic computer are analog, but they do represent numbers in digital form - the wires either have enough charge on them to be a 1, or not enough, in which case they represent a 0.

That said, true analog computers definitely exist, where the amount of charge on wires is not quantized to represent numbers.
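The quantisation point in miniature, as an illustrative JavaScript sketch (the threshold and nominal voltage levels here are arbitrary, not taken from any real logic family):

```javascript
// The digital abstraction in one line: an analog voltage reads
// as 1 above a threshold and 0 below it. Noisy voltages that
// stay on the right side of the threshold still decode to
// exactly the intended bits. That is what makes a device
// "digital" even though the wires underneath are analog.
const toBit = (volts, threshold = 0.8) => (volts > threshold ? 1 : 0);

// Nominal 0 V / 1.8 V levels, perturbed by noise:
const noisyVolts = [0.1, 1.7, 1.9, 0.2, 0.05, 1.75];
const bits = noisyVolts.map(v => toBit(v));
// bits is [0, 1, 1, 0, 0, 1]; the noise vanishes at quantization
```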
 
Just like an iPhone.
An iPhone does not have dual gigabit network connectivity, nor bus expansion options, nor the built-in ability to connect to a general purpose display. So it's not "just like" an iPhone at all. An iPhone offers a limited set of local and network connectivity options, not sufficiently general to be identified as a computer.

There are dozens of iPhone apps that allow you to connect to the internet, and either or both download data or be a server for network data,
Only if Apple's happy about how that software uses the network (way worse before this year). And only if your server can cope with Apple's crippled notion of multitasking.

using the same Unix BSD socket interface as does any Linux server.
The user doesn't care.

So exactly why does the Xserve's ability to connect a USB keyboard make it a computer,
It doesn't "make it" a computer, although the fact that I can plug in any standard USB keyboard sure helps. Can I pair any standard Bluetooth keyboard with a virgin iPhone now? [To answer my own question: Seems so. Excellent.]

And the iPhone has an onboard PowerVR SGX far more powerful than the 3D GFX on my PowerMac G3,
So? Look out the window, the resolution's awesome! Still not computer-generated, though.

and I can get a video cable for my iPhone from Apple.
Which doesn't actually give you a general-purpose display, does it?
 
An iPhone does not have dual gigabit network connectivity, nor bus expansion options, nor the built-in ability to connect to a general purpose display. So it's not "just like" an iPhone at all.

It doesn't? Seems to me it has both those things via the dock connector.

An iPhone offers a limited set of local and network connectivity options, not sufficiently general to be identified as a computer.

So the first Mac isn't a computer? The IBM PC? iPhone has better built-in network connectivity than those.
 
Wow, I can't believe the "what is a computer?" slap fight is still going on.

How about we all just let Veri stick with his/her preconceived notion that a computer looks something like this, and little else:

[image: UNIVAC I, 1953]


while the rest of us happily go about our "computing" business (email, Web, playing games, and lots of other stuff) on the little "non-computers" we carry around in our pockets.
 
Which doesn't actually give you a general-purpose display, does it?

Are you saying that the iPad's color 1024x768 display does not meet your standard for being a "general-purpose display" which is somehow a requirement for you calling something a computer?

I've had at least a half dozen MacOS and Windows laptop and desktop computers that did not meet that criterion. (Throw in an Apple II+ and Amiga as well, plus maybe a few IBM mainframe and DEC and HP minicomputers to which I've had access.) What do you call those?
 
Wow, I can't believe the "what is a computer?" slap fight is still going on.

I can't believe it either. Every so often, I come back to this thread to see whether any progress has been made. There has been very little. I grew up with the idea that a computer was any machine that could be programmed to take an input, process/store or otherwise manipulate it, then spit out something useful. I thought that was a good definition.

But this thread... it just keeps going! Never mind that the news that spawned this thread has moved on since then: https://www.macrumors.com/2010/08/0...or-antenna-performance-comparisons-from-site/
 
I thought you said you used to work with ECL.

Mostly CML, but same principle. Also CMOS, though. And ECL works too - I either charge the base enough to bias the base-emitter junction or I don't. As a result I raise output voltages above the reference or I don't. (In CML the reference is merely the inverse of the output). At no point does any wire represent a numerical value of anything other than 0 or 1. I cross the threshold or I don't.
 
It doesn't? Seems to me it has both those things via the dock connector.
Assuming pinouts.ru is correct, we have video out for watching videos (but not as a general purpose display output), a serial port (with a protocol as consistent as Microsoft's), and the USB pins allowing the system to act as a USB device (but not bus host, so it can't be regarded as an expansion bus). A frivolous incantation at the USB pins is also required to get charging started.

So the first Mac isn't a computer? The IBM PC? iPhone has better built-in network connectivity than those.
Even if your assertion about iPhone were true, nothing stated creates the logical consequence that Mac/IBM PC are not computers.

Now the iPhone has faster network connectivity to particular networks (as you'd expect for the year) but the original IBM PC had more supported options for connectivity (as you'd expect from a computer).

firewood said:
Are you saying that the iPad's color 1024x768 display does not meet your standard for being a "general-purpose display"
I'm going to go out on a limb and guess that no human being on this earth has worked in front of an iPhone screen for 8 hours a day for the past year. You know why? Not because the resolution is insufficient, but because it's physically way too small. The same cannot be said for any display from my Mac Plus to the classical home cinema (film projector, scavenged cinema seats and all - I wish I had been old enough to take part in building it!) enjoyed long ago in the family attic.

LagunaSol said:
Wow, I can't believe the "what is a computer?" slap fight is still going on.
Because it's a fun few minutes' distraction for apparently so many people and it might even cause people to learn stuff ;-). I guess MD has learnt a bit about analogue computing, firewood about Cray 1 instruction timings, and cmaier about descriptive linguistics. I've increased my knowledge of the ARM VFP copro in addressing firewood and contemplating asynchronous technology led me to read up more on AMULET, which many years ago I'd regrettably skipped over when doing some undergrad logic design course.
 
Now the iPhone has faster network connectivity to particular networks (as you'd expect for the year) but the original IBM PC had more supported options for network connectivity (as you'd expect from a computer).

My original IBM PC had NO network support. I had to buy a Hayes modem to get it connected. It had no ethernet port, no token ring, no nuthin' unless you added it.

The iPhone has wifi, bluetooth, a 3G connection, and a dock connector built in.

re: your other point, iPhones can output their display onto car screens now, so I'm not sure why you think it's limited to movies.
 
Now the iPhone has faster network connectivity to particular networks (as you'd expect for the year) but the original IBM PC had more supported options for connectivity (as you'd expect from a computer).

IIRC, after I upgraded the memory in my early Mac to 512k, it could do AppleTalk networking. But other than using external modems, I don't recall any other network connectivity options. Certainly no ethernet, no USB, no video output, no color, no boards in slots, etc.

There was a token ring network board available for the Apple II+, but I don't think it was a very popular or common option.

Far from a lot more options than an iPhone.

Assuming pinouts.ru is correct, we have video out for watching videos (but not as a general purpose display output)

There are several RDP and VNC apps in the App Store that will display a full Mac or PC desktop on an external monitor (Wyse PocketCloud is one) via the iPhone video out cable, i.e. the exact same display you'd get hooking the monitor directly to your PC or Mac. That's in addition to movies, etc. What's not general purpose about that display capability?
 
My original IBM PC had NO network support. I had to buy a hayes modem to get it connected.
But it had, via an open hardware platform sanctioned by IBM, the ability to add mostly whatever network support you pleased. You could do this at least via the serial port (my original AT has one, anyway) or ISA cards. This could then be driven by whatever software you chose. It is this freedom to choose which has accompanied the public's perception of a computer. On an iPhone I have neither a full gamut of standard expansion options nor the ability for general purpose expansion.

When the Mac came out, one of the criticisms frequently raised was that it lacked the expandability of the IBM PC/AT. Jobs once again was trying to change the public's perception of a computer from something complex but highly configurable to an albeit nutritious ready-meal. But you could still connect via the lowly serial port to whatever networking peripheral you chose, using the software of your choice (Microphone, I think I used on my Mac Plus). The only host-side/peer-to-peer general purpose peripheral connection on the iPhone - the serial port on the dock - is, last I checked, not generally documented.

re: your other point, iphones can output their display onto car screens now, so I'm not sure why you think it's limited to movies.
I'm guessing they've always had the technical ability to: if there's a video out on the dock, all you need is to jailbreak, reach supervisor mode and program the hardware. Maybe you're referring to iPod Out. I'm not sure how this is implemented, but I don't see any indication that the current version of the OS openly provides a general purpose screen mirroring function (or similar) on the dock connector... or does it?

firewood said:
There are several RDP and VNC apps in the App Store that will display a full Mac or PC desktop on an external monitor (Wyse PocketCloud is one) via the iPhone video out cable.
I'm looking for the iPhone being able to output (and appropriately tailor - not just EVERYTHING LARGE) its own UI to an external, locally connected display. Ideally it would be a display of quality comparable to the public's current perception of computer display, i.e. not TV/video out, but using what's available with current hardware would be a good start.
 
But it had, via an open hardware platform sanctioned by IBM, the ability to add mostly whatever network support you pleased. You could do this at least via the serial port (my original AT has one, anyway) or ISA cards. This could then be driven by whatever software you chose. It is this freedom to choose which has accompanied the public's perception of a computer. On an iPhone I have neither a full gamut of standard expansion options nor the ability for general purpose expansion.

When the Mac came out, one of the criticisms frequently raised was that it lacked the expandability of the IBM PC/AT. Jobs once again was trying to change the public's perception of a computer from something complex but highly configurable to an albeit nutritious ready-meal. But you could still connect via the lowly serial port to whatever networking peripheral you chose, using the software of your choice (Microphone, I think I used on my Mac Plus). The only host-side/peer-to-peer general purpose peripheral connection on the iPhone - the serial port on the dock - is, last I checked, not generally documented.

The dock port is documented. You may or may not have to pay licensing fees, but Apple does tell developers how to access the connector and build hardware for the connector. But you are moving the goalposts. The iPhone already has, fresh from the factory, more network connectivity than most "computers" had up until pretty recently. It also has a dock connector which third parties can and do use to provide additional expandability - they may not yet use this to provide additional networking, but if not that's mostly because there's little need thanks to the iPhone's multiple radios.


re: your screen mirroring addition - yes, this is Apple-sanctioned. It's a new feature in iOS 4.
 
The dock port is documented. You may or may not have to pay licensing fees, but Apple does tell developers how to access the connector and build hardware for the connector. But you are moving the goalposts.
Afaict the only thing the port gives you which is usable for general purpose I/O, even if you do buy an accessory licence thing, is the serial port. It may be that the chipset supports USB On-The-Go but this isn't official.

they may not yet use this to provide additional networking, but if not that's mostly because there's little need thanks to the iPhone's multiple radios.
I think it's because no-one would be interested today in networking at the speed of the serial port.

You also seem to imply that having some sort of publicly-documented peripheral expansion port is now a necessity to earn the name "computer." This, too, would leave out many things that people universally accept as computers.
Examples?

re: your screen mirroring addition - yes, this is Apple-sanctioned. It's a new feature, and BMW has, for example, announced it will support it in 2011 cars:
I see no implication that it's a general screen mirroring service, just an output of the iPod music playing UI. At first I thought this was via the video out, though the way it's integrating with the car UI it might be some screen remoting protocol. This suggests something more sophisticated. Am I missing something obvious in the SDK?
 
I'm looking for the iPhone being able to output (and appropriately tailor - not just EVERYTHING LARGE) its own UI to an external, locally connected display. Ideally it would be a display of quality comparable to the public's current perception of computer display, i.e. not TV/video out, but using what's available with current hardware would be a good start.

Who cares what you are looking for? There are many many things called computers that have nothing like the displays you want. And even what you call "a good start" will likely be an obsolete output device in a couple decades (like the modified analog TV sets that were popular for use with Apple I's). But they will all still be for computers.

And why would any user want a touch UI mirrored on a desktop monitor? The opposite is far more useful (desktop UI displayed on an iPhone, a very popular app category for use by mobile sysadmins, BTW).

Your "definition", if you even have one now, is becoming more useless and uninteresting by the minute.
 
Computers may be analog. Not every electronic computer is digital.

And going a bit off-topic, not every computer is electronic. Some are mechanical.

Some are human. As someone mentioned, just a few decades ago there were people who were called "computers", since their job was to compute.

Back on topic, I built my first analog electronic computer from a kit around 1963. The next year, I "invented" a handheld calculator when I built an electronic slide rule using three pots and a meter in a tiny box.

Interestingly, the iPhone uses a combination analog-digital memory. Its flash memory is an MLC type... meaning multi-level cell... which uses four distinct analog voltages to hold two digital bits per cell.
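A sketch of how four levels hold two bits. Note the voltage windows and the level-to-bits mapping below are invented for illustration; real flash read thresholds and encodings are device-specific:

```javascript
// MLC idea in miniature: one cell holds one of four analog
// voltage windows, and each window decodes to two bits.
// The ranges and the bit assignment here are illustrative only.
function readMlcCell(volts) {
  if (volts < 1.0) return [1, 1]; // lowest (erased-like) level
  if (volts < 2.0) return [1, 0];
  if (volts < 3.0) return [0, 1];
  return [0, 0];                  // highest level
}

readMlcCell(0.4); // [1, 1]
readMlcCell(2.5); // [0, 1]
```

The storage medium is analog (a continuous voltage), but the read-out quantizes it, which is why the device as a whole is still digital in the sense cmaier describes.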
 
Who cares what you are looking for?
You could have just said, "I don't think it's possible."

There are many many things called computers that have nothing like the displays you want.
A defining characteristic of a computer sat in front of a person for his enjoyment is a general purpose output device. The iPhone exposes nothing (officially) to enable such a device, afaik.

And even what you call "a good start" will likely be an obsolete output device in a couple decades (like the modified analog TV sets that were popular for use with Apple I's). But they will all still be for computers.
An iPhone connected to a keyboard and a TV set would be a lot closer to a computer in terms of public perception, although expectations about minimal performance have increased somewhat since the '80s and I'm not sure whether the resolution offered by a TV would be taken seriously. More significant then would be whether the UI is sufficiently general-purpose and the extent to which I'm restricted from doing what I want with the device.

And why would any user want a touch UI mirrored on a desktop monitor?
Yes, why would people want to see information presented with larger text and taking up more than a tiny proportion of their field of vision? Seriously?
Anyway, you're right, merely mirroring the iPhone UI on a larger screen isn't good enough for general-purpose computing.

The opposite is far more useful (desktop UI displayed on an iPhone, a very popular app category for use by mobile sysadmins, BTW).
In increasing loudness of "uh-oh":
- Sysadmin by desktop GUI.
- Sysadmin by remoting to a desktop GUI.
- Sysadmin by remoting to a desktop GUI on a 3.5" thumb screen.

But, yes, the iPhone is a good graphical terminal. As is my VT220. Actually, remote vector graphics is a lot cooler than simply compressing bitmaps, but I doubt the user cares.

Your "definition", if you even have one now, is becoming more useless and uninteresting by the minute.
You talk in terms of processing power and limited pre-selected I/O options. I talk in terms of the ability to operate and the freedom to apply the device as the user wishes. The longer this thread goes on, the more I perceive the disconnect between how the tech crowd thinks and how the public thinks, even while accepting the definitions of both in context.

kdarling said:
Interestingly, the iPhone uses a combination analog-digital memory. Its flash memory is an MLC type... meaning multi-level cell... which uses four distinct analog voltages to hold two digital bits per cell.
This is interesting - didn't the 8087 do similar? However, as cmaier's post indicates, the quantisation implies we're still dealing with a strictly digital device.

(Unless there's something else which makes it analogue.)
 
This is interesting - didn't the 8087 do similar?

That's true... forgot about that. Thanks!

However, as cmaier's post indicates, the quantisation implies we're still dealing with a strictly digital device.

*grin* Of course, every electronic digital computer is actually using analog voltages inside to hold its state. Or even mechanical levers and fulcrums in some cases.

Heck, who can forget mechanical delay-line memory? Or magnetic drums.

(Waiting for someone to also point out a photon-based cell or something. Perhaps crystalline-based memory?)
 