It was mostly marketing, especially back when they started and the market was saturated with manufacturers all churning out their own computers that didn't interoperate well with anyone else's. Standardizing on one platform saved educators and then businesses time because they didn't have to re-educate students or employees on a new system. This especially bore fruit as computers gained power and took on functions that had been relegated to mainframes, meaning experience with a particular type of computer could become central to a job. I really think Apple took Moore's Law to heart and, given the steady shrinking of components, projected out what role computers would play in business. Why pay for a hugely expensive mainframe when only a few people in a company might need that much power and the rest need far less? It's more cost effective to buy a few powerful desktops and save tens to hundreds of thousands on the mainframe.