Maintaining Hardware / feasibility of keeping hardware running

Open discussion on or around SGI; software/hardware related posts should go in the appropriate subforum.
MrBill
Posts: 271
Joined: Fri Dec 28, 2012 12:50 pm
Location: Vero Beach, Florida

Maintaining Hardware / feasibility of keeping hardware running

Post by MrBill » Mon Nov 20, 2017 9:20 pm

SGI hardware has been a big thing for me. I learned about it while it was still in its prime and I was still in school, and decided I really wanted to get some of the gear that was top notch at the time. I can't even begin to gauge how much time I've spent working on, or thinking about getting things working on, these machines. It's cool stuff, but I'm having some second thoughts here.

Hardware keeps failing across the board on my desksides, and I can't seem to get ahead of the cost of replacement parts. Between the Crimsons and the Onyx 10000 deskside and rack, shit's going bad faster than I can swap it out. Part costs on eBay are borderline insanity for parts that are right on the edge of failure themselves. The desksides have always been the holy grail of hardware I one day wished to own. I really wouldn't want to trade this stuff for anything in the world, but I'm getting concerned about the long-term feasibility of keeping these things running. I am running into several issues. I had two Crimsons, one of which has been sold.

The Crimson I still own has developed a power supply issue. Sometimes it will get to the PROM; other times it doesn't even get that far. It ran for a few months while I was using it as a server. I've been unable to get the power supply issue fixed, and on top of that it now has an issue with the IO3 controller too.

I feel like a jerk. I sold the other Crimson; it was working flawlessly right up until it left, but a couple of weeks after it was shipped it developed a power supply issue similar to my own. It's to the point that I don't even want to risk selling something as working, in case it fails on arrival and I get blamed for selling broken junk.

The Onyx 1000 deskside has had issues since day one: a PROM battery issue, and not all memory is recognized. It's still usable in this shape. Again, I use it on a day-to-day basis and it usually runs non-stop. Today the keyboard locked up mid-use and is no longer detected.

The rackmount Terminator takes up a quarter of the already small room, and I haven't even been able to get it to power on. It takes 3-phase power; I've tried running 2/3 of the power supplies on a regular 220 V outlet. It's essentially an expensive large black rectangle that does not power on.

I don't really know what I want to accomplish by posting this here. Perhaps just to get what's on my mind out. These frequent hardware issues really have me concerned; I never anticipated things failing so frequently. It's getting costly, and it's borderline insanity paying top dollar to keep such old gear running. Again, I like this stuff, it's some of the coolest gear on the block that no one else has, but at the end of the day I could realistically be using a laptop to do the same job. I might just hit the "F it" moment here, migrate everything to a PowerEdge server and have an SGI clearance sale. I never thought I would say such a thing, I never thought I would ever get rid of this stuff for anything, but again, I'm really concerned about the long-term usability of these machines. It's also getting harder and harder to keep modern software running on them. I had to just stop and think about how much time has been spent on workarounds and tweaks to make these things usable, on both the hardware and software side.

I could just keep them to one side and have some nutcase SGI museum in my house (that's worked fine for me up till now), but what's the point? If it's not being used, I may as well not have it.
:Octane: :Octane: :Octane: :Octane2: :Indy: :Indy: :Indigo: :Indigo2: :Crimson: :Tezro: :Tezro: :Onyx2: :Onyx2: :Onyx: :OnyxR: :BA213:

Irinikus
Posts: 363
Joined: Wed Apr 27, 2016 4:25 am
Location: Cape Town, South Africa

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Irinikus » Mon Nov 20, 2017 10:16 pm

MrBill wrote: The desksides have always been the holy grail of hardware I one day wished to own. I really wouldn't want to trade this stuff for anything in the world, but I'm getting concerned about the long-term feasibility of keeping these things running.


At some point, we're all probably just going to have to come to the realization that we can't keep all of these machines running forever! :(

In my case, what I plan to do is retire all of my SGI machines except one (probably the Tezro) while they're still in good working order, and simply keep them as collectors' pieces. (At least I'll know for myself that they were working when I retired them.)

The nice thing about the Tezro is that, for now, spares are rather easy to come by once you actually have the machine (obtaining a good Tezro chassis is the challenging part!). And if I compare it to my other SGIs in terms of performance, it's far better IMHO.

It's rather sad, but it will be far easier and more cost-effective to keep one good (fast) machine going than to try to keep them all going.
Last edited by Irinikus on Tue Nov 21, 2017 12:17 am, edited 1 time in total.
AlphaStation 255/300MHz; 486 DX4 100MHz (My first Computer); JavaStation-10 (Krups); Sun Fire V880z.

Dodoid
Posts: 643
Joined: Mon Jul 04, 2016 1:36 pm
Location: Ottawa, Canada

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Dodoid » Mon Nov 20, 2017 11:47 pm

I think there may be some systems that will outlive others. Left in a vacuum, the cheap, abundant, beginner-friendly, reliable systems of today, like the Octane or Indigo2, would probably outlive the expensive, rare, hard-to-operate, and hard-to-keep-going systems like the Onyx or perhaps even the Tezro. That said, the rarer systems are also a lot cooler. I'm a lot more fussed about getting my broken Onyx running than my broken Indigo2 (especially since one of my Indigo2s works, so the other may as well be a parts box anyway), or the Octanes in the "corner of IP30". That may lead to the more notable or cool systems getting more attention from owners when they break, and it could even offset the difference.

I'm also hopeful for the future of SGI repairs. Think about it. We can already 3D print drive sleds, emulate SCSI devices, install IRIX over the network with DINA, run L3 controller software in a VM, fix Toshiba CDROMs with a well known, well understood procedure (gluing the gear), back up and fix EFS and XFS filesystems with GNU/Linux boxes, reanimate dead Dallas chips with a dremel and careful wiring, replace fans with quieter, more reliable alternatives, replace Fuel power supplies with Kuba's board, emulate pre-Indigo2 keyboards with the heatshrinked adapters everyone seems to have, manufacture new 13w3 adapters for the first time in years, and most importantly talk about all of it on Nekochan. Think about that for a second. An Indigo with dead fans and drives, no keyboard or monitor adapter, and an internal battery that's dead as a doorknob can be fully restored to perfect working order by someone who got into SGIs yesterday, only has PC peripherals, and has no access to IRIX install media just by reading some Nekochan posts, without ever touching an original SGI replacement part.
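
On the "back up and fix EFS and XFS filesystems with GNU/Linux boxes" point, the backup half is easy to script once the SGI disk is hanging off a Linux machine. Here's a rough sketch of the idea (my own, purely illustrative; /dev/sdb and the file names are placeholders for whatever your setup actually uses):

Code: Select all

#!/usr/bin/env python3
# Rough sketch: raw-image an SGI disk (EFS or XFS) attached to a Linux box
# and record a checksum so the archive can be verified later.
# Assumptions: the disk shows up as a block device (placeholder /dev/sdb),
# this runs as root, and there's space for a full raw image.
import hashlib
import sys

SOURCE = "/dev/sdb"        # placeholder -- check dmesg/lsblk for the real device
TARGET = "sgi-disk.img"    # raw image; can be written back to a replacement disk
CHUNK = 4 * 1024 * 1024    # 4 MiB per read keeps memory use low

def image_disk(source: str, target: str) -> str:
    """Copy the whole block device into a file and return its SHA-256."""
    digest = hashlib.sha256()
    copied = 0
    with open(source, "rb") as src, open(target, "wb") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            dst.write(block)
            digest.update(block)
            copied += len(block)
            print(f"\r{copied // (1024 * 1024)} MiB copied", end="", file=sys.stderr)
    print(file=sys.stderr)
    return digest.hexdigest()

if __name__ == "__main__":
    print(f"{TARGET} sha256={image_disk(SOURCE, TARGET)}")

Nothing you couldn't do with dd, but keeping the checksum next to the image makes silently corrupted archives easy to spot later. An XFS image can usually be loop-mounted read-only on a current Linux kernel to pull individual files back out; read-only EFS support exists in Linux too, though it's less commonly enabled, so for EFS the raw image itself is the main safety net.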

Just think about what we could do a few years from now. Someone on this forum is already mapping out the O2 frontplane. What about building a totally new drop-in power supply system for the Onyx, or emulating the System Controller, (M)MSC, or SN1 L controllers with an Arduino? 3D printing totally new skins? Sure, we might not be able to build a brand new R16000, or replace a dead CRM chipset with an FPGA, but those aren't common points of failure.
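
As far as I understand it, the (M)MSC and L-type controllers are basically small microcontrollers that sequence power and fans and answer simple text commands over a serial line, which is why the Arduino idea doesn't sound crazy. Purely as an illustration of the shape of such an emulator (the commands, replies and port below are invented placeholders, not the real protocol), the logic could be prototyped on a PC with pyserial before ever touching a microcontroller:

Code: Select all

#!/usr/bin/env python3
# Illustrative sketch only: a line-based serial command responder, the sort of
# thing you could prototype on a PC before moving it onto an Arduino.
# The commands and replies are made-up placeholders, NOT the real MSC/L1
# protocol -- that would first have to be captured from working hardware.
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # placeholder USB-serial adapter
BAUD = 9600             # placeholder baud rate

# Fake state for the pretend controller.
state = {"power": "off", "fan_rpm": 2400, "temp_c": 31}

def handle(cmd: str) -> str:
    """Map one text command to a canned reply (all names invented)."""
    if cmd == "status":
        return f"power={state['power']} fan={state['fan_rpm']}rpm temp={state['temp_c']}C"
    if cmd == "power on":
        state["power"] = "on"
        return "ok: power sequenced on"
    if cmd == "power off":
        state["power"] = "off"
        return "ok: power off"
    return f"error: unknown command '{cmd}'"

def main() -> None:
    with serial.Serial(PORT, BAUD, timeout=1) as port:
        while True:
            line = port.readline().decode("ascii", errors="replace").strip()
            if line:  # skip timeouts / empty lines
                port.write((handle(line) + "\r\n").encode("ascii"))

if __name__ == "__main__":
    main()

The code is the easy part; the real work is sniffing and documenting the actual protocol from a working system. Once that's written down, the same loop fits comfortably on a five-dollar microcontroller.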

Basically, as hobbyist/maker technology has advanced over the decades since SGI systems were being released, we've gained new ways to repair and replace essentially all of the common failed parts. Sure, if an irreplaceable chip dies, maybe we won't be able to fix that, but if it's a capacitor, a Dallas chip, a fan, a drive, a piece of power circuitry, or any of the things that we actually see failing on a regular basis, I think it's not unreasonable to say that we can probably fix those.

Not every SGI will survive forever, but if it's honestly something we care about (and I have every indication that we do), we certainly have the skills, knowledge, and technology to keep a lot of units of a lot of models of SGIs running for many years into the future.

I certainly plan to try my best to keep mine going.
:Onyx: :O2000: :Fuel: :Octane: :Octane: :Octane: :O2: :O2: :Indigo2: :Indigo2: :Indy: :Indy:

Shiunbird
Donor
Posts: 415
Joined: Fri May 06, 2016 1:43 pm
Location: Czech Republic

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Shiunbird » Tue Nov 21, 2017 2:06 am

I guess the most complex ICs can't be repaired, but they are not the ones that break easily anyway.

Capacitors, resistors, batteries, power supplies, etc., that's all resolvable. The only limitation is the skill of the engineer.

jan-jaap
Donor
Posts: 4937
Joined: Thu Jun 17, 2004 11:35 am
Location: Wijchen, The Netherlands

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by jan-jaap » Tue Nov 21, 2017 2:42 am

I've had my fair share of these problems over the years. I too used to own an Onyx that would eat one or two boards every year. I must have replaced most of the system over time and it would just keep failing. I hated that thing. In the end someone gave me an Onyx IR and that one has never had a single problem (well, it had a faulty RM when I got it and I still haven't been able to source a known-good replacement for acceptable money). My Challenge L never gave me any grief either. I have a Professional IRIS which appears to have a fault in some RAM buffer on the GM board, causing a DMA error on the CPU when starting NeWS (or X11). For years I searched for a replacement, and when I finally found one, it threw the exact same error :(

Some systems seem to age better than others. Everybody knows that the RM boards of the Reality Engine (and RE2, and Onyx1 IR) fail *a lot*. I have the feeling this is a case of bad solder joints due to big SMD components on large, and rather flexible, PCBs, possibly in combination with the thermal stress of power cycling. The older VGX boards are through-hole technology and I've not seen one die in 15 years. Same for the much more rigid IR boards of the Onyx2/Onyx3000. Could it be SGI learned a lesson here?

In general, I've always tried to hoard as many spare parts as possible. It's one thing to mod a CR2032 onto a dead Dallas, but to repair SGI iron at the component level is something different. I sometimes envy the people who restored that Alto recently. Schematics! Off-the-shelf components! There's no practical way to diagnose and fix SGI big iron beyond the replace-a-board level. The field diagnostics may sometimes give you an indication of which quadrant of a PCB to look at, but to give you an idea of what you're up against, here's an IP7 CPU board. There are four of these in a 4D/380. In total there are a dozen or so similarly filled boards in such a system:
[Attachment: IMG_1404_filtered.jpg]

As you can see, there are two daughtercards, each with a CPU, an FPU, cache memory and some logic ASICs. The chips with the white part# stickers on them are GALs. Internal configuration undocumented, of course. Further back, on the main PCB, there are many more, plus some ASICs for good measure. It's possible to identify some serial port controllers and the PROM chips.

If such a board throws some mysterious PowerPath error at you (this happened to me), you're pretty much lost. There's no way to link it to a chip, and even if you could, it probably wouldn't be something you could order a replacement for.

Still, this board has been "repaired" by me at some point. If you look carefully, you'll notice one FPU has a copper lid and the other one is a plain ceramic "tile". The most common failure mode for these boards is a broken cache SRAM chip, and that happened to one of the CPUs on this IP7. But the daughtercards can be detached. If you have another IP7 with a similar problem you can combine the good daughtercards on one IP7 and have a working board again. No such luck with its successor, the 40MHz IP15 board. Those CPUs need a heat sink, so there's no space for daughtercards and everything is on the main PCB. I have two IP15 boards in storage, both unusable due to a broken SRAM chip :(

Power supplies also die quite often, especially the big ones. Again, I keep spares. Some can be repaired more easily than others. The PowerOne PSUs of the older PowerSeries are somewhat documented and modular. It's possible to start them up with a dummy load on a bench. I wouldn't know how to do that with the PSU of an Onyx2, which reduces repairing it to poking around looking for dried up caps.

I wouldn't be surprised if my 4D PowerSeries will outlive many of the newer systems. Time will tell. I too sometimes wonder what I'd do if one day half my systems were unusable. It would be depressing.

As far as using these systems and the amount of runtime they get: I don't run them 24/7 on server tasks (except my Origin 350). Also, when I'm working with them in the evening hours, it's not like I power up all 20 or so of them and then frantically hammer away at 20 keyboards like some mad scientist. It's usually limited to one or two systems. My family likes to claim some of my time as well. So it's not uncommon that a couple of months pass before I power up that Onyx or Indy again.
:PI: :Indigo: :Indigo: :Indy: :Indy: :Indy: :Indigo2: :Indigo2: :Indigo2IMP: :Octane: :Octane2: :O2: :O2+: :Fuel: :Tezro: :4D70G: :Skywriter: :PWRSeries: :Crimson: :ChallengeL: :Onyx: :O200: :Onyx2: :O3x02L:
To accentuate the special identity of the IRIS 4D/70, Silicon Graphics' designers selected a new color palette. The machine's coating blends dark grey, raspberry and beige colors into a pleasing harmony. (IRIS 4D/70 Superworkstation Technical Report)

Irinikus
Posts: 363
Joined: Wed Apr 27, 2016 4:25 am
Location: Cape Town, South Africa

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Irinikus » Tue Nov 21, 2017 2:55 am

Which system with good performance do you think would probably be the easiest to keep going far into the future?

jan-jaap
Donor
Posts: 4937
Joined: Thu Jun 17, 2004 11:35 am
Location: Wijchen, The Netherlands

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by jan-jaap » Tue Nov 21, 2017 4:50 am

Irinikus wrote:Which system with good performance do you think would probably be the easiest to keep going far into the future?

Define "good performance" ;) Seriously though: if a 5 year old MacPro crushes an 8 CPU Onyx 350 by an order of magnitude, how do you think any SGI will compare to a contemporary system 10 or 15 years from now?

When a German Fraunhofer Institute bought a 4D/380 in 1990, it was worthy of press releases etc. It cost them a million DM. Today, I own that machine and it is every bit as capable as it was back then. At the same time, its performance is basically zero by today's standards (I'll try to remember to run C-ray on it next time it's up). That same fate awaits every other MIPS/IRIX system. The 15-year-old Octane2 is a lot faster than the 25-year-old Indigo, but neither of them is capable of browsing the web in a meaningful way, something even the cheapest smartphone can do.

Performance doesn't matter. What matters is that it can run meaningful and interesting applications. Applications it ran 15 or 20 years ago maybe when it was new and state of the art. And unless it dies, it will still run those applications 15 years from now.

There's no guarantee that any single system will still work tomorrow. That's not how statistics work. That said, certain 'risk factors' apply:
* Anything that moves wears out. Think: hard disks, fans etc. As long as this doesn't cause secondary damage (heat death due to broken fan) it's not a big problem because these are relatively generic parts.
* If it runs hotter, it dies faster.
* Any power supply will die.
* It takes only one bad component for a board to be unusable. More components means bigger risk of board failure.

Add this up and you'll see why the Onyx1, which turns at least 1kW into heat in a beer-fridge-sized box, with its countless components and separate power bricks on just about every major PCB, doesn't stand a chance.
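
To put some very rough numbers on that last risk factor (made-up figures, purely to show the scaling, not measured failure rates):

Code: Select all

# Back-of-the-envelope only: assume every component on a board independently
# survives a year with probability p. A board with n components then survives
# the year with probability p**n. The absolute numbers are invented; the
# drop-off with part count is the point.
p = 0.999  # assumed per-component annual survival rate (illustrative)
for n in (100, 500, 2000, 5000):
    print(f"{n:>5} components -> {p**n:6.1%} chance the board survives the year")

The exact rate doesn't matter; what matters is the exponential penalty for sheer part count, and the big desksides pay it on every one of their boards and power bricks.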

What worries me a bit about the last generation of MIPS systems is that they add a lot of complexity which is not related directly to actual computing, yet increases the risk of failures. Octanes that can't read their NIC chips. Fuels and O3x00 series with faulty environmental monitoring chips. My own Fuel is down, not because the CPU died, but because the I2C chip that communicates the CPU serial# to the L1 doesn't work.

The best chance (IMHO) would be a box which runs relatively cool and is simple, with a relatively simple PSU which can be serviced by a generic technician. The Indy R5000. The Indigo. But hey, by the same rationale the O2 should do well too, yet somehow that doesn't seem to be the case.

Trippynet
Donor
Posts: 812
Joined: Thu Aug 15, 2013 6:22 am
Location: Aberdeen, Scotland, UK

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Trippynet » Tue Nov 21, 2017 5:20 am

For me, the Indigo2 has to be one of the more solid systems - but the higher-end Max IMPACT cards are a lot more fragile than Solid IMPACT due to TRAM issues. Part of the issue with later systems is also that SGI was doing a lot worse financially, so some corners were cut to reduce the BoM - hence the Fuel being in a more standard PC-tower-type case. The Indigo2 and Indy, however, were made when SGI was flying and didn't need to cut costs quite so much, which is why I find they tend to be more robust electronically.

For all the Fuel's issues, one advantage it does have is that it's now easy to run it from a normal PC power supply thanks to Kuba's adapters. Given that PSUs are one of the most common points of failure on old systems, this is useful. Of course, it doesn't change the fact that the rest of the hardware in the Fuel tends to be flakier than in earlier systems.
Systems in use:
:Indigo2IMP: - Nitrogen: R10000 195MHz CPU, 384MB RAM, SolidIMPACT Graphics, 36GB 15k HDD & 300GB 10k HDD, 100Mb/s NIC, New/quiet fans, IRIX 6.5.22
:Fuel: - Lithium: R14000 600MHz CPU, 4GB RAM, V10 Graphics, 72GB 15k HDD & 300GB 10k HDD, 1Gb/s NIC, New/quiet fans, IRIX 6.5.30
Other system in storage: :O2: R5000 200MHz, 224MB RAM, 72GB 15k HDD, PSU fan mod, IRIX 6.5.30

Irinikus
Posts: 363
Joined: Wed Apr 27, 2016 4:25 am
Location: Cape Town, South Africa

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Irinikus » Tue Nov 21, 2017 5:56 am

jan-jaap wrote:
Irinikus wrote:Which system with good performance do you think would probably be the easiest to keep going far into the future?

Define "good performance" ;) Seriously though: if a 5 year old MacPro crushes an 8 CPU Onyx 350 by an order of magnitude, how do you think any SGI will compare to a contemporary system 10 or 15 years from now?


I meant "Good Performance" in SGI terms (among the various SGI machines).

In other words, which "high end" SGI machine do you think would be the easiest to maintain into the future?

jan-jaap wrote:Performance doesn't matter. What matters is that it can run meaningful and interesting applications. Applications it ran 15 or 20 years ago maybe when it was new and state of the art. And unless it dies, it will still run those applications 15 years from now.


I understand and agree with you completely. However, if I look at how it is today, for example: even though they are much slower than other machines I have at my disposal, I can still handle the speed of my Tezro, Octane2 and Onyx2, as they can at least run IRIX properly (windows come up quickly).

However, anything below the O2 R12K 400MHz is simply too slow for me to personally enjoy (even just for clicking around in IRIX).

If a system can at least run its operating system properly, then it won't become irritating to use, no matter how old it is or how much faster the contemporary computers of the day are.

If I am to play with and enjoy IRIX on an SGI into the future, I would much rather do it on an SGI with some grunt "in SGI terms", hence my question.

However, I will still end up owning most, if not all, of the SGIs produced, preserved in a functional state for collectors' purposes (excluding the various rack-mount systems, as they don't interest me at all; there are smaller and IMHO more collectable SGIs which possess the same technology).
Last edited by Irinikus on Tue Nov 21, 2017 7:04 am, edited 4 times in total.

Irinikus
Posts: 363
Joined: Wed Apr 27, 2016 4:25 am
Location: Cape Town, South Africa

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Irinikus » Tue Nov 21, 2017 6:27 am

Trippynet wrote: For me, the Indigo2 has to be one of the more solid systems - but the higher-end Max IMPACT cards are a lot more fragile than Solid IMPACT due to TRAM issues. Part of the issue with later systems is also that SGI was doing a lot worse financially, so some corners were cut to reduce the BoM - hence the Fuel being in a more standard PC-tower-type case. The Indigo2 and Indy, however, were made when SGI was flying and didn't need to cut costs quite so much, which is why I find they tend to be more robust electronically.


The Indigo2 Impact is an awesome system and definitely on my to-do list for all of the reasons you mentioned.

Jack Luminous
Donor
Posts: 200
Joined: Mon Sep 26, 2011 12:59 am
Location: PARIS

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Jack Luminous » Tue Nov 21, 2017 6:51 am

I only run an Octane and an O2. I have almost every part in duplicate (or sometimes triplicate). I even have a spare dual R14K/600MHz module for the Octane, just in case. Every once in a while, when I'm short on money, I think about selling it, but then I think "what if the damn thingamabob fails? I'll never be able to afford another one."... Keeping these things running can drive you crazy! :lol:
:Octane2: :Octane: :Octane: :O2:

Irinikus
Posts: 363
Joined: Wed Apr 27, 2016 4:25 am
Location: Cape Town, South Africa

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Irinikus » Tue Nov 21, 2017 6:55 am

Jack Luminous wrote: I only run an Octane and an O2. I have almost every part in duplicate (or sometimes triplicate). I even have a spare dual R14K/600MHz module for the Octane, just in case. Every once in a while, when I'm short on money, I think about selling it, but then I think "what if the damn thingamabob fails? I'll never be able to afford another one."... Keeping these things running can drive you crazy! :lol:



The Octane2 R14K 600MHz is an awesome machine and, in my opinion, a very good one to focus your resources on to keep running well into the future. (It runs IRIX properly.) :D

jan-jaap
Donor
Posts: 4937
Joined: Thu Jun 17, 2004 11:35 am
Location: Wijchen, The Netherlands

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by jan-jaap » Tue Nov 21, 2017 7:56 am

Irinikus wrote: However, I will still end up owning most, if not all, of the SGIs produced, preserved in a functional state for collectors' purposes

That's more or less what I pieced together, though I certainly didn't start with that plan in mind. And in the process of acquiring all that hardware I ended up with an attic full of largely untested parts. Now I'm trying to sanitize all of that a bit. This week I'm going through boxes full of keyboards and mice, and when I'm done I will have 1 set per system, cleaned up, tested and labeled. The rest (at least one crate) will have to go.
Irinikus wrote: (excluding the various rack-mount systems, as they don't interest me at all; there are smaller and IMHO more collectable SGIs which possess the same technology).

I drew the line at tall rack systems. And then I cheated and got a 4D/380 Predator rack :mrgreen:

At some point when you have lots of systems, various infrastructure bits make sense and they usually come in 19" form factor (console server, network switches, FDDI concentrator, disk arrays, ...). Given all this equipment, a 19" rack makes sense. A 42U rack takes as much floor space as a half high one and it's not like you'd use the empty space above it, so you might as well get the tall rack. And then your next server might as well be a rack mount because that's space efficient. And that Origin 350 would fit nicely as well. You see where this is going.

Actually, I'm quite happy with this arrangement. By stacking all infrastructure in the rack, I avoid a lot of clutter in the room in between the systems. The only downside is that I had to buy a lot of cables again, in longer lengths.

Trippynet
Donor
Posts: 812
Joined: Thu Aug 15, 2013 6:22 am
Location: Aberdeen, Scotland, UK

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Trippynet » Tue Nov 21, 2017 9:09 am

Irinikus wrote:The Indigo2 Impact is an awesome system and definitely on my to-do list for all of the reasons you mentioned.


Indeed, it's not a super quick system by later SGI standards, of course (my Fuel can run rings around it performance-wise). You have to remember that the motherboard technology is from 1992, and you can tell when using it that memory and I/O performance really were pushed to their limits a few years later when the R10k was added. The 10MB/s SCSI is also a bit limiting, although at least the sluggish network performance can be improved via a 100Mb NIC.

But despite it being a bit slow compared to some other systems, it's built like a tank - and I can honestly say my Indigo2 gets powered up far more often than my Fuel does. I only really use the Fuel for tasks that are beyond the Indigo2's realistic capabilities (textured 3D stuff, compiling bigger apps or ones that need features newer than IRIX 6.5.22, playing around with most emulators, etc.).

Irinikus
Posts: 363
Joined: Wed Apr 27, 2016 4:25 am
Location: Cape Town, South Africa

Re: Maintaining Hardware / feasibility of keeping hardware running

Post by Irinikus » Tue Nov 21, 2017 9:56 am

jan-jaap wrote:
I drew the line at tall rack systems. And then I cheated and got a 4D/380 Predator rack :mrgreen:

At some point when you have lots of systems, various infrastructure bits make sense and they usually come in 19" form factor (console server, network switches, FDDI concentrator, disk arrays, ...). Given all this equipment, a 19" rack makes sense. A 42U rack takes as much floor space as a half high one and it's not like you'd use the empty space above it, so you might as well get the tall rack. And then your next server might as well be a rack mount because that's space efficient. And that Origin 350 would fit nicely as well. You see where this is going.


Rack-type systems such as the 4D/380 Predator, Onyx and Onyx2/Origin 2000 are really cool in my opinion and such systems may be considered way down the line.

However, rack-based systems such as the O350, O3000, Onyx4 and Altix are the type of systems that don't really interest me, and I certainly wouldn't go to any great amount of effort to acquire one of them.

I much prefer the deskside and desktop options here (the Tezro and Prism desksides, for instance).

In your case, the need for a 19" rack makes a lot of sense, and yours really looks neat!

