Ya......here is what you should do. Go get FOG Project. It's a free and easy-as-hell-to-install computer imaging solution that runs on Ubuntu Server. I use it here at home and my old school used it. It brings an easy-to-use interface and computer management system to the table, and even offers automatic Active Directory joining. Couple it with Deep Freeze by Faronics and you will never have a problem.....Please, for the love of god, stop using discs to re-image a lab.
Free, open-source Ghost: https://fogproject.org/ The only part that I haven't done is the scheduling. Once you have that figured out, the machine will just boot off the network, image the drive, and then boot to your normal OS.
https://fogproject.org/ was my go-to for many years before we moved to SCCM, and I still keep my original 0.29 server running for situations where SCCM isn't cooperative. Seems to be a lot more actively developed than it was the last time I had to set up a server, so I imagine that process is simpler than it used to be.
The best system I've found for managing images is the Fog Project.
Basically, there's a server that outputs a menu to the target client on boot via DHCP/PXE.
That menu allows you to create an image of that specific client (or groups of similar clients), which is then uploaded back to the server.
The image can thereafter be downloaded to that client, or any similar client.
You can modify the software installed on the PC to your heart's delight but if you need to restore it to its original condition, you simply select an option from the PXE boot menu.
This allows you to create several images with different configurations for your PCs.
Restoring an image usually takes less time than making a cup of coffee.
Of course, for a Linux installation, you'd mainly be concerned with restoring the root partition (and other optional partitions).
One can get quite fancy with this tool. It's entirely possible to have different distros saved as images.
You can manage thin client images or fat client images.
This is not the only way to do this, but it does make things easy to manage.
FOG with PXE, we are mainly a Windows shop but have some Linux here and there. Free, open source, supports multicast and has a bunch of other bootable tools built in (memtest, testdisk, NT password reset, etc).
Go the MDT route. Relatively easy to set up, and the net is full of docs on it. MDT is also free. SCCM, while a full-featured suite that can do significantly more than just imaging, isn't free and can be a somewhat complex setup. I saw FOG the other day, and while it looks pretty slick, I don't have any experience to fully vouch for it.
Make sure the switches you use support PXE.
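On the DHCP side, the PXE handoff usually comes down to two options. A hedged ISC dhcpd fragment as an illustration (the subnet and server IP are placeholders; `undionly.kpxe` is FOG's usual BIOS boot file, and UEFI clients want `ipxe.efi` instead):

```
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;
  next-server 192.168.1.10;     # option 66: the TFTP server (your FOG/MDT box)
  filename "undionly.kpxe";     # option 67: boot file for BIOS clients
}
```

With MDT/WDS you'd point `filename` at WDS's boot program instead; the mechanism is the same.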
https://docs.microsoft.com/en-us/windows/deployment/deploy-windows-mdt/deploy-a-windows-10-image-using-mdt
Good luck man.. er um.. woman? happy friday
There's also FOG, but this isn't really a networking question.
You may want to ask /r/techsupport (to help you get Clonezilla working), /r/homelab (for suggestions on imaging software), or if the number of machines to image is significant, /r/sysadmin.
If you ignore the insane "live migration" component of your request, https://fogproject.org/ is what I have experience with for imaging Win7. There were zero options for live systems however, but I haven't dealt with that software in, jesus, 10 years? more than 10 years?
I'm confused as to why you'd even be trying to restore an out of support end of life os at all, you clearly don't have access to the extended support so IMO your goal here should be getting RID of the win7 system entirely and migrating whatever garbage is on there, to an actual server os. Barring that, at least an OS that's still supported.
Or, you know, buy the extended support and install acronis and keep paying for extended support AND lock that win7 box in a closet with no network access.
Is the web UI still finicky and picky? The last time I remember using it, I just got frustrated with the web UI being so vague and unintuitive. The input fields were always menacing and unhelpful. The interface felt like it made the service harder to use than it should have been. Loading ISOs seemed to randomly fail, and trying to deploy an image seemed to time out and fail sometimes. I didn't have much confidence in it.
Linux is always more lightweight than Windows. However, which is a better PXE server for you depends on what you need to do.
If you will:
Then a Windows server is the right choice.
Otherwise, use Linux.
If your PXE needs are pretty simple, you can do a manually set up PXE server using SYSLINUX, GRUB2 or iPXE. If you have more complex needs (multiple OSes, multiple boot types (BIOS/UEFI) and/or multiple arches (x86, x32, ARM, etc.)), then you likely want to go with a complete solution.
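For the simple manual route, a bare SYSLINUX setup is little more than a `pxelinux.cfg/default` menu on the TFTP server. A sketch (the kernel paths and labels here are made up for illustration):

```
DEFAULT menu.c32
PROMPT 0
TIMEOUT 100

LABEL memtest
  MENU LABEL Memtest86+
  KERNEL memtest

LABEL ubuntu
  MENU LABEL Ubuntu installer
  KERNEL ubuntu/vmlinuz
  APPEND initrd=ubuntu/initrd.gz
```

That's workable for one or two BIOS-only boot targets; it's juggling UEFI variants and multiple arches where a complete solution starts paying for itself.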
I would recommend FOG Project. It's relatively painless to set up, easy to use and intuitive, well documented, and the forum is active and a great source for getting help. It really accomplishes 2 main things:
So you can use FOG to capture/deploy OS images to client machines for example. PXE is technically used in the process but you are not deploying an OS install (like using a CD/DVD/USB stick to install). Or you can use it to PXE boot an installer, so that you can do a clean install on a networked machine without needing physical media with you.
FOG leverages iPXE which is extremely flexible and powerful but also complex to setup and manage manually. FOG makes it very easy to work with iPXE.
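For a feel of what FOG hides from you, its iPXE binaries embed a small script that chains into the web server, roughly along these lines (simplified from memory of the `default.ipxe` that lives in a FOG server's tftpboot directory; the real one passes more parameters, so treat this as a sketch):

```
#!ipxe
dhcp
chain http://192.168.1.10/fog/service/ipxe/boot.php?mac=${net0/mac}
```

Everything after that (the boot menu, task checks, image transfers) is driven server-side, which is why you rarely have to touch iPXE syntax yourself.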
I have FOG currently able to install via PXE (not including images):
I can also use it to boot:
I could not even get close to the above abilities manually setting up SYSLINUX or GRUB2 across BIOS and UEFI for x86.
I inherited a FOG server and have really been enjoying it. The issue that is most limiting is my network ops people refusing to enable PXE boot on the network, so I either have to do one-off deployments, or plug PCs into a separate switch to do multicast deployments...
I honestly couldn't recommend it enough.
EXCEPT - I have since discovered we have licensing for a commercial product which is faster and more suitable, so I'm concentrating on moving to that; If not for that turn of events, I'd still be happily using FOG, though
Just thought I'd chime in with this. The FOG Project is a great open-source tool for computer cloning and image management.
Copied right from the wiki:

> FOG is more than just an imaging solution, FOG has grown into an imaging/cloning and network management solution.
>
> * PXE boot environment (DHCP, iPXE, TFTP, fast HTTP download of big boot files like kernel and initrd)
> * Imaging of Windows (XP, Vista, 7, 8/8.1, 10), Linux and Mac OS X
> * Partitions, full disk, multiple disks, resizable, raw
> * Snapins to install software and run jobs/scripts on the clients
> * Printer management
> * Change hostname and join domain
> * Track user access on computers, automatic log off and shutdown on idle timeouts
> * Anti-Virus
> * Disk wiping
> * Restore deleted files
> * Bad blocks scan
Very versatile and useful for machine management!
we use this: https://fogproject.org/
we can deploy a Windows 10 image to the Dell laptops in under 5 minutes and it's then good to go for the user. the domain joining function is also great.
I moved to this the other week and haven't looked back. As far as cloning solutions go, it has everything.
Oh god I'm doing it again...
Oof, you've got your work cut out for you. You're a teacher, I'm guessing? The good news is it seems like you're pretty much starting with a blank slate, and that makes things somewhat easier.
Get PDQ Deploy and Inventory, it will make your life a lot easier for maintaining software packages.
If you want to go the in-house OS deployment route, check out FOG Project.
Add all workstations/laptops/users to the domain. This can be automated.
15 hours a week is not enough, unless they're happy with broken stuff for days/weeks on end. You'll need some help. Hop on the k12sysadmins slack and we can probably swing some advice as you need it.
Not quite sure what exactly it is you're looking for. But where I work we use FOG to deploy the image and join the machine to the domain. Specific apps/shortcuts/etc are then either deployed using FOGs snap-in tool or through group policies.
Is this image supposed to be deployed to multiple machines or is it more of a backup of a single machine? For the former I use FOG, and for the latter I usually go with dd (backing up my / to somefile.tar.gz). Of course you could use FOG for this as well, I just haven't bothered to set it up at home.
EDIT: Now with linux, you probably want to regenerate ssh hostkeys and such if it's going to be deployed on multiple machines.
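The usual cleanup on a Debian/Ubuntu clone looks something like the following (run it on the deployed machine or bake it into a first-boot script; paths assume Debian-style OpenSSH packaging, so adjust for other distros):

```
rm -f /etc/ssh/ssh_host_*
dpkg-reconfigure openssh-server   # regenerates the host keys
truncate -s 0 /etc/machine-id     # a blank machine-id is regenerated on next boot
```

Skipping this leaves every clone with identical host keys, which means SSH "host key changed" chaos and a real security problem if any one machine's keys leak.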
First off, do you have the infrastructure to support a network bootable (PXE) deployment scenario?
If so, using MDT is a free option that a lot of companies use. There are even powershell scripts that will setup an MDT server for you auto-magically. Let me know if you are interested and I'll provide a link to one.
If you're looking to deploy other operating systems, and are technically inclined with Linux, the Fog Project is a great alternative open source option.
If you're looking for a Microsoft solution, look at Windows Deployment Services (WDS), which is a role on Windows Server (howto here). Otherwise there are many 3rd party deployment apps around such as Acronis Snap Deploy, or free/OSS ones like FOG
> * Windows 10
Kind of a given, get Pro minimum.
> * i7+
Overkill for the majority of applications. Go i5.
> * At least 16gbs of ram
You can get by with 8, 16 is better.
> * SSD
Yes. Easily has more impact on speed than more RAM.
What you didn't touch on is WiFi. Get something with 802.11AC minimum, 802.11AX preferably. Intel if you can. This way you're somewhat futureproof and Intel plays nicest with the majority of WiFi deployments out there.
I like to give out laptops with full size keyboards, not everyone uses the numpad, but those who do will be very happy. This instantly reduces the amount of models you can pick from. Also, get a screen resolution of 1920x1080 minimum. The 1366x768 a lot of them still come with is atrocious and a lot of external equipment doesn't like it.
Pick a business line like Dell Latitude, HP Pro/EliteBook, Lenovo Thinkpad T/X Series. You can get better warranties, get access to business support channels and there's driver packs available.
> I would love the laptop to have an easy way to replace the hard drives so I can keep master clones of machines and replace when I need to.
What? No offense, but cloning and swapping hard drives is needless busywork that can easily be automated. You shouldn't have to touch laptop internals ever, unless it's break/fix. Read up on MDT or FOG for a deployment solution.
if you have no partitions then yes, it'd be helpful to do that
effectively
this all can be configured using things like FOG or MDT a bit more easily than doing it manually
although doing it manually is a good learning exercise
Depending on what resources you have available and your comfort level, if MDT or SCCM pose issues for you, we’ve had great success using FOG project (https://fogproject.org). Happy to share more info/specifics if you’re interested.
Another shout out to https://fogproject.org. Can use the API to initiate tasks if the in-built scheduler isn't enough. Or change properties.
Handles MANY other things for environments that don't/can't fully use AD.
Good performance, web-based management for remote imaging,
Fog service plugin that will handle stuff like "Your computer needs to image. Rebooting in 2 minutes" and "Your computer was just imaged and needs to join the domain, doing it now", and "Your computer name doesn't match (either from an update in AD or from just being imaged), changing name now and rebooting in 2 minutes". And more.
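Since the API got mentioned: a hedged Python sketch of what initiating a task looks like. The endpoint path, header names and task type ID here are assumptions from my memory of the FOG API docs, and `fog.example.lan` is a placeholder, so verify against your server's version before relying on this.

```python
import json
import urllib.request

def make_deploy_request(server, host_id, api_token, user_token):
    """Build (but don't send) the POST that queues a deploy task for a host."""
    url = "http://{}/fog/host/{}/task".format(server, host_id)
    body = json.dumps({"taskTypeID": 1}).encode()  # 1 = deploy, if memory serves
    req = urllib.request.Request(url, data=body, method="POST")
    # FOG authenticates API calls with two tokens, one for the API and one per user
    req.add_header("fog-api-token", api_token)
    req.add_header("fog-user-token", user_token)
    req.add_header("Content-Type", "application/json")
    return req  # fire it with urllib.request.urlopen(req) against a real server

req = make_deploy_request("fog.example.lan", 42, "APITOKEN", "USERTOKEN")
print(req.full_url)  # http://fog.example.lan/fog/host/42/task
```

From there a cron job or any scheduler you like can queue imaging tasks, which is handy when the built-in scheduler doesn't cut it.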
I'm a high school student (senior) who got to work with the tech department as an intern over the summer. When we refreshed our PLTW labs, we set up one machine with all of the software needed (Chrome, the Autodesk suite, Office, RobotC, etc). We used FOG to create an image of that configured machine and, over the network, pushed that image to all of the new machines. However, we still have to manually tag, label, rename, and add each machine to the domain. The machines that we bought from HP came pre-installed with Windows, but we imaged them with Windows 10 Education.
I use FOG Project at work, in a 250+ workstation environment that is constantly receiving new images. Takes some setup and knowledge of switches and domain controllers if you're using Windows Server DHCP, but works better than anything else I've ever used. Puts all other imaging systems to shame. And it's FREE.
What is happening when you try? I assume you have the Ubuntu netboot tarball. Have you set up tftp? Do you have the extracted files in /srv/tftpboot, or whatever name you gave that directory?
I am going to need a little more information on your errors you are getting.
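If it helps, a minimal dnsmasq setup can serve both the PXE pointer and TFTP from one daemon. A sketch (the interface name, root path and boot file are assumptions about your layout; proxy mode leaves your existing DHCP server alone):

```
port=0                        # disable DNS, keep only DHCP/TFTP duties
interface=eth0
dhcp-range=192.168.1.0,proxy  # proxy mode: don't hand out leases ourselves
enable-tftp
tftp-root=/srv/tftpboot
pxe-service=x86PC,"Ubuntu netboot",pxelinux
```

Comparing what this answers on the wire against your current setup (tcpdump on ports 67/69 helps) usually shows where the boot is falling over.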
You could also try use FOG server if this isn’t working.
https://fogproject.org/
You could just use the open imaging platform that's considered the linux competitor to WDS, FOG. https://fogproject.org/
It supports driver injection and post-imaging tasks, however if you can get away with it just install all the software on the primary image and activate it after deployment.
I've heard a lot of people that have used the FOG project for deploying both Linux and Windows over the network. I'm not sure if it would work for OSX as well, but if you could manage all three systems with the same utility, I think that would be a huge win just generally speaking. https://fogproject.org/
I was in the same boat as you a few weeks ago. We needed a free solution so all of the Microsoft tools were out of the question. We ended up using FOG Project. Definitely worth checking out. Also has some other features for deploying applications and also keeping track of inventory by registering each machine.
Do you have Windows Server or can you spin up a VM?
It takes a little effort but you can use the features that exist with Windows to setup a fairly slick imaging system that will get your machines imaged, drivers and key software installed, activated and joined to the domain.
Deployed thousands of PCs just by using these methods - it's easier if you have SCCM, but it isn't a requirement.
You can also use FOG - haven't used it for a while but it looks like it is still being updated - https://fogproject.org/
I'm literally just starting to play around with FOG at work, so far it's pretty impressive. I'm using it for windows images, but I understand it works for linux, and "may" work for OS X but may need other tweaks, but IMO OS X doesn't really need to be imaged, just having a Munki server to deal with software deployment is all I need, since OS X isn't loaded with crapware like windows usually is.
You might look into Fog. If all you're needing to do is deploy workstations with a standard configuration, you'd go through and configure your base image, then clone that image to your fleet as you need to hand out devices.
As for what registry keys to change and manipulate, you just really have to dig into it. Once you get an image the way you want it, you'll need to make sure to update it on a regular basis as well. Quarterly or Bi-annually, boot the image up, apply patches across windows and any 3rd party software you have installed, update 3rd party software if appropriate, etc. Then bake it back down to an image for cloning again.
Not a Windows guy; Security, Incident Response & nix
If all the hosts have identical hw... I've reimaged hundreds of Win7 hosts from a raw dd image using a FOG server
edit: link format
PXE boot is mostly used for mass deployment of OSes to hardware. Think a corporate environment where you've got 100s of machines and want consistency across all the systems for OS setup/configs.
Granted this is mostly for Windows, but I think an imaging solution that might work for Linux/Pi would be FOG.
You might want to look at fog: https://fogproject.org or maybe clonezilla.
Running PXE broadcasts across an already configured network is not advised, as most PXE environments require the DHCP server to run an IP helper service to point clients to the PXE server.
More information about the pxe boot process:
https://docs.microsoft.com/en-us/troubleshoot/mem/configmgr/boot-from-pxe-server#pxe-boot-process
You could set this up, it is free and open source imaging client https://fogproject.org/
I also work for a small law firm, I do plan to eventually set up some automated process as well when I have time.
You can use a tool such as FOG (https://fogproject.org/) to deploy a known state to all the machines and then still use ansible to update them regularly with security updates, etc.
I would be more concerned about configuration drift on the machines than a bad update taking them all down. (ie, a student installs a new, unstable package on a single machine, etc)
When I used to work in education, I re-imaged all labs between terms, to make sure they were in a known state.
I've heard good things about the paid version of Macrium Reflect. Because my environment is fairly small and budget is always an issue, I've used Clonezilla to do one off clones as needed. Even when replacing all of our public computers for example (we have 30), it wasn't THAT bad to image them one by one.
With any imaging software, deploying to dissimilar hardware (as in, different in ANY way) may lead to issues. Ideally you want identical hardware.
https://fogproject.org/ is an option for mass cloning over your network. I went through setting it up years ago only to find out that the specific bios used on our public computers was not compatible with it. Huge bummer. All that work, ready to clone.... aaaand it's not compatible. However, setting this up is probably more work than you're looking for if you just want to do a one off clone.
> So my first question is this: how can one create an iso of a custom windows 10 installation. Either on a machine or in a vm.
> Secondly how can you install that iso on another system? Would rufus do the trick?
Rufus will not pass Secure Boot because it uses a custom boot loader chain. It tells you this once created. For the security of the machines, Secure Boot is likely on (I hope). Instead, just format a USB as FAT32 and copy over all the contents from a Windows ISO except the install.wim file. You can split your custom-made install.wim into parts to copy to the USB's sources folder. The resulting USB is ready to go.
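The split is one DISM invocation, run on the machine where you built the image (drive letters here are placeholders; FAT32 caps files at 4GB, hence chunks of ~3800MB):

```
Dism /Split-Image /ImageFile:C:\custom\install.wim /SWMFile:E:\sources\install.swm /FileSize:3800
```

Windows Setup picks up the resulting install.swm, install2.swm, etc. from the sources folder automatically.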
> A third question is for my longterm goal. We have a fog server at my school some external partner set it up years ago. But nobody knows how it works... can you guys give me any info/documentation or point me to the right subreddit about how fog servers work in practice?
You will need to reach out to higher ups to locate the information to find out who set it up. Someone was paid to do this and as a result there is a paper trail with a phone number or email somewhere. Once in contact you can integrate your workflow into it if desired. Or you can start here: https://fogproject.org/support and https://wiki.fogproject.org/wiki/index.php?title=Main_Page
I see, looks tight. It partly seems similar to what the FOG project is doing. To be honest, I was hoping someone would fork the netboot manager from this project to make it a standalone netboot manager. It seems so powerful!
Have you considered an image capture/deployment solution?
At my office we use FOG, you can find more info about it at https://fogproject.org
We run a mixed OS environment of about 300 machines, mostly Ubuntu as a cost cutting measure, but management uses Windows 10. We set up a computer, install packages, remove packages, and otherwise create our ideal OS to meet current business demands, then capture it using FOG. The way we keep image sizes down is to make sure the computer we use for building the OS uses the smallest hard drive we currently have on hand. I have never actually dug into how small the images end up being but there's a level of compression that takes place. Our FOG "server" is a repurposed Dell Optiplex, proving the whole thing can literally run on a potato.
I've also had success capturing and deploying images with CloneZilla, it doesn't require any central server or even network infrastructure, just the ability to boot from USB. I've been able to get images down as far as 5-10GB depending on what I was actually adding to it/using it for.
Do you not re-image machines with a corporate image before sending them out?
EDIT:
> solved this issue by having a master image that is deployed via WDS/MDT/SCCM etc. but that's not always an option for everyone.
It isn't? Even if your company is strapped for cash and won't approve a turnkey solution, you can build a FOG box from an old desktop to use as an image pusher.
We use one that is completely standalone and just has a dumb switch attached to it that we use for imaging.
> would allow for any type of computer
Well, not all computers support PXE so one solution won't cover everything.
What I used was FOG Server setup on a VLAN with a few dedicated network ports.
Not sure why you want a mini computer, or what exactly you mean by that. I personally would dig up an ancient desktop with a 1TB hard drive and 4-8GB of RAM, should cost you almost nothing and will push out an image at at least 100Mbit.
The FOG project is something to consider as you can make an image and then deploy it to any registered or unregistered host however is limited compared to SCCM as you can not deploy drivers or applications unless they are built into the image. At my work, we use FOG and basically create an image for each type of computer, capture the image, and then deploy.
For deploying applications, inventory, and remoting into computers (which I used to use SCCM a lot for) we use ManageEngine which is free up to 25 computers and can be self-hosted.
Not sure if this would be a good fit, as it's been a while since I looked at it. But back when I was using Norton Ghost Suite we had looked at https://fogproject.org/ as a potential replacement. I left that company before the project went anywhere, but it may be worth a look.
It sounds like you might be looking for an imaging software solution like https://fogproject.org/. This would be a full hard drive image rather than an incremental backup. You might consider combining the two: use FOG for imaging the OS base and installed applications once you've finished your initial setup, and then use an incremental backup program for data.
PXE boot and DBAN (use Fog if you like GUI, or use DHCP + TFTP if you prefer the CLI for the PXE boot server configuration).
No need to extract the disk from the machine, just plug the machine into the "cloning" network switch, power it up, get into the bios (F12 or escape key on most machines), choose "boot over the network", click enter, go to the next machine, rinse, repeat.
Gamify it with your colleagues for more fun and to go even faster.
Here is another option. FOG project sets up a PXE server for you in just a few steps. It's way friendlier.
Though PXE is another option, it is not easier than handling the partitions and bootloaders yourself when doing this kind of installation.
Hi, Do you want to install a new PC with Windows 10 using a Win10 image?
If the answer is yes, you could use FOG Project https://fogproject.org/
A complete and good tutorial https://www.ceos3c.com/open-source/install-fog-server-ubuntu-server-16-04-lts-ultimate-guide-virtualbox/
We've had a lot of success with FOG: https://fogproject.org/
We regularly reimage loads of Windows, Mac and Ubuntu laptops with saved images we've got in a library. It's really easy - boot the laptop off PXE, select the image, wait, reboot.
Might be worth a look
Call me old school .... but wireless? Eew. Any way you can cable it?
For machine backups, have you thought about Fog Project?
Build looks sensible enough for Plex and everything else.
I know it's a tired mantra, and you are probably aware, but I'll say it anyway... RAID is not a backup. If you care about the data, ensure a proper solution is in place.
Depends how you want to deploy. Do you want to deploy images to blank computers or do you want to physically have the SSD with you in your workshop, deploy the image onto it and then install the SSD into a machine?
For deploying/cloning/imaging machines over the network look into FOG Project: https://fogproject.org
In addition to deployment tools you can also have bootable/live ISO files on the network to boot from via FOG, such as memory tests, offline virus scanners and any other tool you'd bot via a CD/USB/ISO file.
If you had optical media and a machine with an optical drive you could pull the drive out of the laptop, put it in that other machine, install there, and then move the drive back.
You could also set up a PXE server. You could use something like FOG to do some of the lifting for you: https://fogproject.org/
To be honest you'll probably be best off just buying a cheap USB stick.
Hi,
We use FOG. It's really simple to use and set up. No extra whistles and bells, which is how I like it. Too often software tries to do too much and doesn't get the basics right. Plus it's free.
We use only HP machines: HP EliteBook 850 for mobility, HP Z4 and HP Z2 for the guys who use Autodesk products, and HP ProDesk 600 for the accountant.
And we use FOG https://fogproject.org/ for the images.
OK, so you're looking for a deployment solution. I had understood, for example, that you wanted to save a computer if the disk crashes. For this I use FOG (https://fogproject.org/) for Windows and Linux operating systems. Before uploading my Windows image, I just launch sysprep to "generalize" the image. MDT/WDS do the same thing, but I don't know them.
If you need some imaging, take a look at fog project: https://fogproject.org/
We also use it here in a small company, to install Windows on legacy/UEFI systems very quickly, including all the programs you might need.
Seconding the move to Office 365. Much easier than doing it yourself. We also use Azure AD to do all the heavy lifting regarding sign-in. Free with Office 365.
We also use snipe it. Works great.
Just saw FOG in one of the pictures. That's something your sysadmin is using to manage the machines on the network. I believe it's attached to https://fogproject.org/ , go talk to the sysadmin with that information and ask for a job (also be very open about how you found those things so as not to make them suspicious)
Hassbian has Hassbian Scripts, which is a set of shortcuts for some useful things, so I use that for updating the base OS via "hassbian-update" and Home Assistant via "haupdate" on the command line. I could even create a script in HA that called them, but haven't bothered yet. Actually, I'm going to do that now.
I keep my configuration in a private git repo, but don't have any real snapshot setup. You can always pull the SD Card/USB stick and image it, but that's a pretty manual process.
The one automated way I could think of for doing that would be to use a FOG Server and set it up to periodically PXE boot the Pi & image it, but that would be quite the project.
I honestly don't have a lot of experience with Docker but I do think it would be a much easier way to handle updates, snapshots & backups, whether via HassOS or Hass.io
Really the main reason I'm using Hassbian over anything else is, in addition to being comfortable with the base Linux OS, you can move the entire install to run off of a USB stick instead of the SD card. I've had the same installation running for 13 months (including updates & restarts; not solid) with no corruption or major issues.
FOG is fantastic. I know it is being used in a multisite deployment for a community college's computer science department in Texas, managing thousands of student workstations.
As Blowmewhileiplaycod says, you can also use the remote console in iLO to act as a virtual keyboard, mouse and monitor. Some are old and crappy and use Java, which isn't supported by modern browsers, so your mileage may vary. However, once you get a monitor plugged in, if it's running Windows or something else, how will you log into it? It'd likely be different credentials from whatever the iLO had (or didn't have, if it wasn't changed).
If you had 100 servers, you'd likely use PXE to boot/install a custom image that had specific configs set up so you don't need to manually go through installer prompts and can set certain things up on setup, like remote desktop enabled. I've never done anything too extensive with it and Windows, but last I looked years ago, https://fogproject.org/ was one option of handling all of that.
I use an image with our main ERP software, office, antivirus, utilities like chrome, cute pdf etc all installed. Once it boots I activate the license and login as the user. I use FOG. I have a script to grab shares and printers if they're wonky.
> endless "searching for updates"
Set it to high performance power setting and go home - I've seen that take 8 hours before. If you image computers a lot, look into MDT or Fog and create an image with all the updates already done, then update it quarterly.
Use Snappy Driver Installer: http://snappy-driver-installer.sourceforge.net/en/download.php