SOLVED - How to locate which disk the bootloader is installed on

Operating System & Version
CentOS
cPanel & WHM Version
106

srkn61

Active Member
Jan 28, 2015
34
1
58
cPanel Access Level
Root Administrator
Hello,

tomorrow we have to pull one of our disks at our provider, Hetzner. We are using a root server with RAID 1 at Hetzner. The man on the phone asked which of the disks should be taken... I am not sure which one to pick, because I don't know which disk the bootloader is located on. Can someone help me identify the location of the bootloader? If I take out the wrong disk, our server may not boot...

I only want to pull the disk without the bootloader. I have to recover some files from it with the tool testdisk, and it is too risky for me to recover files on the live system. So I have to pull one of the disks that make up the RAID 1 array. Which command do I need to find out which disk has the bootloader installed and which one does not?

I attach files that may help. Are there other commands to find out which SSD of the RAID 1 pair has the bootloader and which one does not? After pulling one of the SSDs, the server should still be able to boot without problems...
 

Attachments

cPRex

Jurassic Moderator
Staff member
Oct 19, 2014
12,499
1,971
363
cPanel Access Level
Root Administrator
Hey there! This isn't really a cPanel issue, but we'll likely still have some ideas for you.

From the output you've sent, I'm only seeing one array there, but it's spread across two drives with the RAID configuration. In RAID level 1, the data is mirrored, so you'd have the same data on both drives.

So I think the answer to your question is that both drives hold the same data and would both include the bootloader. This is something you should confirm with your hosting provider or datacenter, though, before performing any destructive work.
 

sparek-3

Well-Known Member
Aug 10, 2002
2,120
255
388
cPanel Access Level
Root Administrator
Are they BIOS boot or UEFI?

If they are BIOS boot then:

grub2-install /dev/sda
grub2-install /dev/sdb


Should ensure that grub is installed in the MBR for both disks.

Best advice: pull one disk and keep it around until you're sure that the server will boot with the other disk. If it doesn't, putting the pulled disk back in should boot the system like it normally does.
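For BIOS/MBR setups there is also a way to check which disks actually have GRUB in their MBR before pulling anything: grub2's boot image embeds the string "GRUB" in the first 512 bytes of the disk. Here is a minimal sketch of that check, run against a throwaway image file so it's safe anywhere; on the server you would point dd at /dev/sda and /dev/sdb as root, as shown in the comments:

```shell
# On the real server (as root), check each disk's MBR for the GRUB marker:
#   dd if=/dev/sda bs=512 count=1 2>/dev/null | grep -aq GRUB && echo "sda: GRUB present"
#   dd if=/dev/sdb bs=512 count=1 2>/dev/null | grep -aq GRUB && echo "sdb: GRUB present"
# Demonstrated below on a fake 512-byte "MBR" file (the offset is
# arbitrary for the demo) so the check itself can run anywhere:
img=$(mktemp)
printf 'GRUB' | dd of="$img" bs=1 seek=392 conv=notrunc 2>/dev/null
truncate -s 512 "$img"
if dd if="$img" bs=512 count=1 2>/dev/null | grep -aq GRUB; then
    found=yes
else
    found=no
fi
echo "GRUB marker: $found"
rm -f "$img"
```

A disk whose first sector contains no GRUB marker is the safer one to pull, but as noted above, keep the pulled disk intact until you've confirmed the server boots without it.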
 

srkn61

Active Member
Jan 28, 2015
34
1
58
cPanel Access Level
Root Administrator
My provider says it is our problem to find out which device the bootloader is installed on; they can't give us any information. But the OS image was configured by Hetzner, so they should know - they're just not motivated to give the answer...
 

ffeingol

Well-Known Member
PartnerNOC
Nov 9, 2001
765
310
363
cPanel Access Level
DataCenter Provider
fdisk -l on the devices "should" show you whether a partition is marked for boot. Software RAID "normally" only puts the boot partition on the first drive in the pair, and you have to take extra steps to actually put a boot partition on the 2nd drive. YMMV :)
 

sparek-3

Well-Known Member
Aug 10, 2002
2,120
255
388
cPanel Access Level
Root Administrator
Ultimately you just really need to know how the server and drives are set up.

Is it legacy BIOS boot with MBR? Legacy BIOS boot with GPT? Or is UEFI?

If you have a disk larger than 2TB, it has to be GPT rather than MBR (MBR partition tables top out at 2TB with 512-byte sectors).

If the partition table is GPT, then it's either Legacy BIOS boot with a BIOS Boot partition or UEFI with an EFI partition.

fdisk -l /dev/sda
fdisk -l /dev/sdb


will tell you whether the Disklabel type is gpt or dos (dos meaning MBR).

If it's dos (MBR), then

grub2-install /dev/sda
grub2-install /dev/sdb


should install grub2 in each individual disk's MBR, which will point GRUB at the mirrored copy of /boot on each disk for loading the kernel.

If it's gpt then you have to figure out if it's BIOS boot or UEFI.

Each disk (/dev/sda and /dev/sdb) should have a partition with type BIOS boot if it's BIOS boot. They would each have a partition with type EFI System if they are UEFI.

If only one disk has a BIOS boot or EFI System then... you may be SOL. That's the only disk that will boot. That's a flawed RAID setup, because if that disk dies the server won't boot regardless of the state of the second disk.

If both disks have a BIOS boot partition, then I believe

grub2-install /dev/sda
grub2-install /dev/sdb


will still install the necessary loading instructions into that BIOS Boot partition to lead grub to /boot on each disk to load the kernel.

If they are EFI System then it gets a bit more complex.

If they are EFI System partitions, you need to make sure the EFI partition on each drive is mounted individually. One of them will be mounted at /boot/efi; the other needs to be mounted elsewhere (/boot/efi2 is common), and the contents of /boot/efi need to be copied to that second EFI partition.

You would then need to use efibootmgr to set that second disk up to be bootable.
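Those steps can be sketched as follows. This is a dry run that only prints the commands, since they require root on a live UEFI system; the device name (/dev/sdb1), the /boot/efi2 mount point, and the loader path are all assumptions you would have to adapt to your own layout:

```shell
# Dry-run sketch of mirroring the ESP to the second disk. The run()
# wrapper just echoes each command; drop the echo to execute for real.
run() { echo "+ $*"; }

run mkdir -p /boot/efi2                 # second mount point (assumed name)
run mount /dev/sdb1 /boot/efi2          # assumes sdb1 is the second ESP
run cp -a /boot/efi/. /boot/efi2/       # copy the existing EFI files over
run efibootmgr --create --disk /dev/sdb --part 1 \
    --label "CentOS (disk 2)" --loader '\EFI\centos\shimx64.efi'
```

The efibootmgr entry makes the firmware aware of the second disk's loader, so the machine can still boot if the first disk is removed.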

It gets really, really complex - I'm just going to stop here, because there are a lot of variables that go into it.
 