Hi,

I'm having some issues booting when my RAID 5 array is attached to my
"server": several distros all stop in GRUB at the "savedefault" line.
Here's my setup:

mobo / CPU: AMD 2000+, Gigabyte GA-VAXP

- 4 x 250 GB SATA (software RAID 5) connected to a Promise 300tx2
SATA controller. The array was created under a previous Debian setup;
I'm trying to migrate it over to a new Debian server.

- 1 x 200 GB IDE (no RAID). Fresh install of Debian on this drive;
it is the primary master.

- IDE CD-RW as the IDE slave

I've installed Debian, openSUSE, and Ubuntu to the 200 GB drive. All
of them seem to halt in GRUB right after "savedefault", i.e. the last
line in grub.conf. When I remove the 4 drives from the Promise
controller, Debian boots OK. The "safe mode" (failsafe/recovery)
entries in grub.conf fail in the various distros as well.
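
For reference, the Debian-generated stanza looks roughly like this
(kernel version and root device quoted from memory, so approximate):

    title   Debian GNU/Linux, kernel 2.6.x
    root    (hd0,0)
    kernel  /boot/vmlinuz-2.6.x root=/dev/hda1 ro
    initrd  /boot/initrd.img-2.6.x
    savedefault
    boot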

I'm fairly certain I never put an MBR on any of the SATA drives (the
RAID 5 members), as I always used a separate IDE drive for booting and
used the RAID 5 purely for data storage.
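
If it helps, I could double-check that from a live CD with something
like this (device names are a guess; the Promise disks probably show
up as /dev/sda through /dev/sdd):

    # dump the first sector of each disk; boot code shows up as
    # "GRUB" or "LILO" in the strings output
    for d in /dev/sd[abcd]; do
        echo "== $d =="
        dd if=$d bs=512 count=1 2>/dev/null | strings
    done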

The grub.conf was regenerated about 8 times by the 5 different OSs
mentioned above, and every version equally failed to boot the system.
Is a separate RAID 5 array problematic for GRUB?


What other info should I post to help diagnose the problem (besides
the grub.conf)? Would LILO help?
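
From a live CD, after mounting the 200 GB drive, I could post for
example (assuming the array assembles as /dev/md0):

    fdisk -l                   # partition tables on all five disks
    cat /proc/mdstat           # kernel's view of the software RAID
    mdadm --detail /dev/md0    # array layout and member devices
    cat /boot/grub/menu.lst    # generated config (grub.conf on some distros)
    cat /boot/grub/device.map  # GRUB's disk-to-BIOS-drive mapping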

PS: Live CDs work, although as far as I know no software RAID is
assembled by a live CD (e.g. Fedora Live).
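
That said, I believe the array could be assembled by hand from a live
CD with something like this (member partitions assumed to be sda1
through sdd1):

    modprobe raid5                 # make sure the RAID 5 driver is loaded
    mdadm --examine --scan         # read the md superblocks off the members
    mdadm --assemble /dev/md0 /dev/sd[abcd]1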

thanks,

M