Since84
Moderator
To infinity and beyond!
Posts: 3,933
|
Post by Since84 on Apr 23, 2019 2:12:15 GMT -8
|
|
Ted
fire starter
Posts: 882
|
Post by Ted on Apr 23, 2019 7:16:40 GMT -8
I’ll take a stab at the Mac360 article... It all depends how Cook defines “primary technology” in the quote. Is an OLED screen primary tech? RAM? Switches and buttons? Or are those commodity products? I’d say that Apple having its own OSs, own SOCs and other proprietary chips, its own programming language, and its own apps, services, distribution networks, intellectual property and company approach to innovation comprise its primary tech - not memory modules, assemblers or aluminum smelting plants. Seems kinda obvious... EDIT: And PED seems to agree with me mostly. www.ped30.com/2019/04/23/apple-stack-primary-technologies/
|
|
chinacat
Moderator
AAPL Long since 2006
Posts: 4,426
|
Post by chinacat on Apr 23, 2019 9:29:11 GMT -8
|
|
JDSoCal
Member
Aspiring oligarch
Posts: 4,182
|
Post by JDSoCal on Apr 23, 2019 9:55:52 GMT -8
“For now, Apple's Face ID is a winning feature for many who can afford a premium iPhone around the world” Personally, I prefer Touch ID over Face ID, which is why I returned my iPhone X. 🤷🏻♂️ Looks like 5G is all marketing hype with little payoff for the end user. I still have a problem with 4G coverage in my area.
|
|
Ted
fire starter
Posts: 882
|
Post by Ted on Apr 23, 2019 11:45:39 GMT -8
Three month AAPL chart. Uhh, wheee.
|
|
4aapl
Moderator
Posts: 3,631
|
Post by 4aapl on Apr 23, 2019 12:41:55 GMT -8
Three month AAPL chart. Uhh, wheee. LOL My wife was just looking at the 6 month chart, on her iPhone. It's the only one that is "down", and thus red. While AAPL has recovered a lot, it dropped peak to trough 39% vs the S&P's 19%. And while the S&P is 0.2% off its former high, AAPL is still 11% off. OTOH, if you look at the percent off the bottom, the S&P would be something like 24%, and AAPL is 46% off the bottom. So it might feel a little euphoric, even though AAPL is still more than 10% off the high.
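The percent-off-the-high and percent-off-the-bottom figures above are consistent with each other, and it's easy to check. Here's an illustrative Python sketch, with peak prices normalized to 100 and the rounded percentages quoted in this post (the actual prices are assumptions for arithmetic only, not real quotes):

```python
def pct_off_high(peak, price):
    """How far below the former peak the current price sits, in percent."""
    return (peak - price) / peak * 100

def pct_off_bottom(trough, price):
    """How far above the trough the current price sits, in percent."""
    return (price - trough) / trough * 100

# Normalize both peaks to 100 and apply the figures quoted above.
aapl_peak, sp_peak = 100.0, 100.0
aapl_trough = aapl_peak * (1 - 0.39)   # 39% peak-to-trough drop
sp_trough   = sp_peak   * (1 - 0.19)   # 19% peak-to-trough drop

aapl_now = aapl_peak * (1 - 0.11)      # still 11% off the high
sp_now   = sp_peak   * (1 - 0.002)     # 0.2% off the high

print(round(pct_off_bottom(aapl_trough, aapl_now)))  # → 46
print(round(pct_off_bottom(sp_trough, sp_now)))      # → 23 (the post's "something like 24%")
```

The asymmetry is just arithmetic: a 39% drop needs a ~64% rally to get back to even, because the recovery is measured against a smaller base.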
|
|
|
Post by PikesPique on Apr 23, 2019 13:18:43 GMT -8
Personally, my portfolio, which includes AAPL but lots of other stuff too, is above its recent (Sep. '18) high, even though I've been drawing my retirement "paycheck" from it monthly. So, yeah, WHEE!
|
|
Ted
fire starter
Posts: 882
|
Post by Ted on Apr 23, 2019 13:37:30 GMT -8
Three month AAPL chart. Uhh, wheee. LOL My wife was just looking at the 6 month chart, on her iPhone. It's the only one that is "down", and thus red. While AAPL has recovered a lot, it dropped peak to trough 39% vs the S&P's 19%. And while the S&P is 0.2% off its former high, AAPL is still 11% off. OTOH, if you look at the percent off the bottom, the S&P would be something like 24%, and AAPL is 46% off the bottom.
So it might feel a little euphoric, even though AAPL is still more than 10% off the high. Yeh, I hear you. The steep climb back up has been encouraging and shows AAPL may be more resilient than in the past, but let's see where earnings puts us . . . to da moon or back down to the socks & undies dept...
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 23, 2019 13:38:50 GMT -8
I’ll take a stab at the Mac360 article... It all depends how Cook defines “primary technology” in the quote. Is an OLED screen primary tech? RAM? Switches and buttons? Or are those commodity products? I’d say that Apple having its own OSs, own SOCs and other proprietary chips, its own programming language, and its own apps, services, distribution networks, intellectual property and company approach to innovation comprise its primary tech - not memory modules, assemblers or aluminum smelting plants. Seems kinda obvious... EDIT: And PED seems to agree with me mostly. www.ped30.com/2019/04/23/apple-stack-primary-technologies/ I agree with you and PED. It all depends on what one means by the term 'primary'. They make the CPU (on an ARM base), which comprises the GPU and down the road a modem, their own security chips, the AirPods chip, and software for everything. And yes, the Qualcomm struggle (and we've merely reached the end of the beginning of that) is a part of it. The piece states two reasons for making your own stuff, differentiation and interoperability, but there are other goods too. Among other things, there is product line integration. The latter is exactly what Apple is trying to do with project Marzipan. I have no idea whether they'll be successful or not, but that kind of integration just isn't feasible if you essentially assemble modular components a la Dell. But here's the thing. Jobs' idea about the sand thing was surely hyperbole. Exaggeration for the sake of making a point. Taken literally it's an absurd absolutist perspective. Simply replacing standard components with proprietary ones won't do anything by itself. The components you make must be made with an end goal in mind, or they will do nothing for you except some crude and unsustainable forms of "lock-in". It's self-defeating overall, as with most absolute goals. You want to make the most essential parts, and not the rest. That's why Jobs also cancelled several Apple proprietary technologies and replaced them with standards.
Replacing Apple Desktop Bus with USB is the most obvious example. Making everything yourself would be self-defeating even if it were possible. Especially since some early technologies are, or turn out to be, transitional and are quickly replaced with something else. Better to let the marketplace bear the cost of that churn, which also creates opportunities for small, fast-moving businesses to make a killing. And even if a technology isn't transitional, better to have a few years to develop your own the best way rather than rushing. I'm a sucker for tech industry history, and just an hour ago I started the book "IBM's 360 and Early 370 Systems". In the introduction it says: "The present volume shows how the decision to displace existing computers [with a software-unified line that could run the same apps] was linked with two other far-reaching decisions, one to develop and use entirely new circuit components, the other to undertake the manufacture of essentially all needed circuit components within the company. A consequence of that decision was to transform IBM from the leading buyer to the largest manufacturer of semiconductor devices. The considerations that led to these bold decisions, and the technical and managerial difficulties encountered on the way to their fulfillment, form the core of the present volume."
|
|
4aapl
Moderator
Posts: 3,631
|
Post by 4aapl on Apr 23, 2019 20:45:35 GMT -8
LOL My wife was just looking at the 6 month chart, on her iPhone. It's the only one that is "down", and thus red. While AAPL has recovered a lot, it dropped peak to trough 39% vs the S&P's 19%. And while the S&P is 0.2% off its former high, AAPL is still 11% off. OTOH, if you look at the percent off the bottom, the S&P would be something like 24%, and AAPL is 46% off the bottom.
So it might feel a little euphoric, even though AAPL is still more than 10% off the high. Yeh, I hear you. The steep climb back up has been encouraging and shows AAPL may be more resilient than in the past, but let's see where earnings puts us . . . to da moon or back down to the socks & undies dept... Ehhhh, for better or worse I think we've had things surpass the "socks and undies dept", such that our recent 39% temporary decrease in AAPL shares was not a worry. OTOH, it wasn't that many years ago that things were compressed enough for us that I passed on the $15 worm-drive skill saw at the thrift store, and the 3/4" shackles at the metal recycling surplus store. While I kick myself for those things a little more than I should, there of course are other things I wish I had done, not even counting those many "coulda shoulda woulda" stocks that did even better than AAPL (Yahoo was nice enough to point out Monster Energy Drinks, which wasn't even on my radar... thanks!). There was a time when I was always optimistic on Apple's earnings. While I agree that this upcoming earnings and guidance will point the way for the stock, I can no longer feel 110% certain of the direction. I'd give it an 80-90% chance that they will outdo expectations on earnings/guidance, but the China/Trump tariffs/whatever could always hit that. But the nice thing with being on the giant ship is that while it might not do well on the slalom, it still plows through any small waves (or boats) in its path. Given the size and strength of Apple these days, I don't expect it to ever beat the market by full multiples (200%+) for any real length of time. But being in the 50-150% range of the S&P isn't that bad. Just the same, it gets bonus points if it can stay a bit or more above the 100% comparison over 2-5 years. Thanks Apple
|
|
4aapl
Moderator
Posts: 3,631
|
Post by 4aapl on Apr 23, 2019 21:09:22 GMT -8
I’ll take a stab at the Mac360 article... It all depends how Cook defines “primary technology” in the quote. Is an OLED screen primary tech? RAM? Switches and buttons? Or are those commodity products? I’d say that Apple having its own OSs, own SOCs and other proprietary chips, its own programming language, and its own apps, services, distribution networks, intellectual property and company approach to innovation comprise its primary tech - not memory modules, assemblers or aluminum smelting plants. Seems kinda obvious... EDIT: And PED seems to agree with me mostly. www.ped30.com/2019/04/23/apple-stack-primary-technologies/ I agree with you and PED. It all depends on what one means by the term 'primary'. They make the CPU (on an ARM base), which comprises the GPU and down the road a modem, their own security chips, the AirPods chip, and software for everything. And yes, the Qualcomm struggle (and we've merely reached the end of the beginning of that) is a part of it. The piece states two reasons for making your own stuff, differentiation and interoperability, but there are other goods too. Among other things, there is product line integration. The latter is exactly what Apple is trying to do with project Marzipan. I have no idea whether they'll be successful or not, but that kind of integration just isn't feasible if you essentially assemble modular components a la Dell. But here's the thing. Jobs' idea about the sand thing was surely hyperbole. Exaggeration for the sake of making a point. Taken literally it's an absurd absolutist perspective. Simply replacing standard components with proprietary ones won't do anything by itself. The components you make must be made with an end goal in mind, or they will do nothing for you except some crude and unsustainable forms of "lock-in". It's self-defeating overall, as with most absolute goals. You want to make the most essential parts, and not the rest. That's why Jobs also cancelled several Apple proprietary technologies and replaced them with standards.
Replacing Apple Desktop Bus with USB is the most obvious example. Making everything yourself would be self-defeating even if it were possible. Especially since some early technologies are, or turn out to be, transitional and are quickly replaced with something else. Better to let the marketplace bear the cost of that churn, which also creates opportunities for small, fast-moving businesses to make a killing. And even if a technology isn't transitional, better to have a few years to develop your own the best way rather than rushing. I'm a sucker for tech industry history, and just an hour ago I started the book "IBM's 360 and Early 370 Systems". In the introduction it says: "The present volume shows how the decision to displace existing computers [with a software-unified line that could run the same apps] was linked with two other far-reaching decisions, one to develop and use entirely new circuit components, the other to undertake the manufacture of essentially all needed circuit components within the company. A consequence of that decision was to transform IBM from the leading buyer to the largest manufacturer of semiconductor devices. The considerations that led to these bold decisions, and the technical and managerial difficulties encountered on the way to their fulfillment, form the core of the present volume." OTOH, there have been tradeoffs in the name of simplification, with standardization, cost, and upgradability pushed to the wayside. ADC is one of those, with the hope of minimizing cords for the G4 Cube paving the way. These days it's still the push towards sleekness, but through no screws showing and being as thin as possible. The downside has been ease of opening/upgrading, and in many products the ability to add memory at all. This bugs the heck out of me and has been my top potential question for Tim if I were to make it down to a shareholders' meeting. To me, as an engineer instead of a designer, form should always follow function.
And while I can see the decision being made with a smartphone that might have a half-life of 2 years and where a 4+ year life is more of an outlier, it really bugs me that iMacs/MacBook Airs/Mac Minis/etc. are being made with no possibility of upgrading the RAM. I upgraded the RAM in my SE (or was it an SE/30) and clock-chipped my 6100. There should be a way to put memory in these current macOS devices! Some day I'll at least send Tim an email about that, in a slightly compelling and more complete way. Until then, I'll have to buy a 27" iMac instead of a 21" if I want to be given the privilege of a potentially longer lifetime of use. With kids, I can normally pass things down. OTOH, I'm using a now 10-year-old 24" iMac, with 54 tabs open, and it ain't half bad. That's more a testament to there not being huge steps forward in CPU needs, at least in my world. Apple (and others) can make a huge upgrade jump here if they can give a good reason to upgrade. AR should do it, even if it's focused on portability. Until then, I just have to wait until we can't hold our 12-year-old back from Fortnite anymore, or I need to give him (or me) the advantage of outrageously fast hardware. But while I got to hear about FN from a 2nd grader this week, my son at the moment is happy with our old-school Nintendo. And it's a great memory challenge to remember all the tips and tricks from playing Super Mario Bros ~35 years ago.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 24, 2019 8:21:09 GMT -8
OTOH, there have been tradeoffs in the name of simplification, with standardization, cost, and upgradability pushed to the wayside. ADC is one of those, with the hope of minimizing cords for the G4 Cube paving the way. These days it's still the push towards sleekness, but through no screws showing and being as thin as possible. The downside has been ease of opening/upgrading, and in many products the ability to add memory at all. This bugs the heck out of me and has been my top potential question for Tim if I were to make it down to a shareholders' meeting. To me, as an engineer instead of a designer, form should always follow function. And while I can see the decision being made with a smartphone that might have a half-life of 2 years and where a 4+ year life is more of an outlier, it really bugs me that iMacs/MacBook Airs/Mac Minis/etc. are being made with no possibility of upgrading the RAM. I upgraded the RAM in my SE (or was it an SE/30) and clock-chipped my 6100. There should be a way to put memory in these current macOS devices! Some day I'll at least send Tim an email about that, in a slightly compelling and more complete way. Until then, I'll have to buy a 27" iMac instead of a 21" if I want to be given the privilege of a potentially longer lifetime of use. With kids, I can normally pass things down. OTOH, I'm using a now 10-year-old 24" iMac, with 54 tabs open, and it ain't half bad. That's more a testament to there not being huge steps forward in CPU needs, at least in my world. Apple (and others) can make a huge upgrade jump here if they can give a good reason to upgrade. AR should do it, even if it's focused on portability. Until then, I just have to wait until we can't hold our 12-year-old back from Fortnite anymore, or I need to give him (or me) the advantage of outrageously fast hardware. But while I got to hear about FN from a 2nd grader this week, my son at the moment is happy with our old-school Nintendo.
And it's a great memory challenge to remember all the tips and tricks from playing Super Mario Bros ~35 years ago. I admit I'm a cynic on upgrading. I started out in sales, believe it or not, just before the first integration wave with IBM's PS/2. A lot of wailing and screaming over soldered components and few slots. My experience was that a tiny percentage ever upgraded, and the contact points of swappable parts contributed to failures. I have the same view with Apple. Few ever upgrade. Now the additional driver is miniaturization. I have a MacBook Air that is 5 years old and going strong. I think the cost in reliability is greater than the cost to the few who would upgrade if they could. I think I could upgrade the disk on my Air, but I don't want or need to. But even on this view, you could say Apple should have set the minimum RAM to 8GB before they actually did. It does certainly put a premium on a manufacturer selecting sane capacities at a given point in time for things that aren't replaceable. I think Apple generally does this, but probably not always. iPhones at 16GB were a case in point.
|
|
4aapl
Moderator
Posts: 3,631
|
Post by 4aapl on Apr 24, 2019 12:40:32 GMT -8
But even on this view, you could say Apple should have set the minimum RAM to 8GB before they actually did. It does certainly put a premium on a manufacturer selecting sane capacities at a given point in time for things that aren't replaceable. I think Apple generally does this, but probably not always. iPhones at 16GB were a case in point. I'm still using a 16 GB iPhone 5S, and have just learned that I can't have everything on it. When my playlists had to get smaller and smaller, I started deleting some of the photos off it, deciding that while I didn't want them in the cloud, I also didn't really need 5 years' worth of photos. It's amazing how much that cleaned things up. Now I haven't worried about space in a year, though it was the move from "any song you own" to "any song you own and really want to listen to" that made it possible. OTOH, both my iMac and my wife's former MacBook Pro had originally had their RAM upgraded to 4GB. Jumping a year ago to 8GB made a nice bump up in these old machines. At some point it's an environmental thing: by making it difficult or nearly impossible to add or replace RAM, the HD/SSD, or the battery, devices end up being recycled or going into the landfill before their time. Nothing lasts forever, but some things have a much longer useful life than others. I'd rather Apple made more of their devices so that they could be kept going a little longer, even if most aren't. But maybe many of them realistically are closer to an appliance like a refrigerator, where there might be a couple of things that can feasibly be replaced, but really it's made to last a long time, and if any serious issues arise, it's really just time to upgrade.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 24, 2019 17:57:41 GMT -8
That's pretty amazing life out of an iPhone 5S. Well, there you go, maybe 16 GB wasn't a bad idea.
I'm also skeptical I need that many photos, but my wife has her photos too and wants to keep them. Since I want a backup for the data, and I don't want to keep all the photos on the phone either, I started using iCloud recently, sent all the imported photos on my Mac to iCloud, and set my wife's phone to ship them there too. I think I've worked around an issue that I hope gets taken care of by a new iCloud feature, namely a collective repository for multiple users. A family plan doesn't do that. Setting multiple phones to use the same Apple ID is no good because it mixes other stuff too, but setting the same ID on my Mac's Photos app and importing photos through it I think works. Or maybe they could split all the functions apart so you could set an Apple ID uniquely for every app.
|
|