Steve Haslam (araqnid) wrote,
"Failure to think things through": old script to copy large files from the server where they are produced and install them on several front-end servers:

  • gzipped the file on the smbfs filesystem
  • copied the gzip output to the local machine
  • moved the gzip output into the archive directory
  • installed the gzip output on each remote machine
  • ssh'd to each remote machine to gunzip the file
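
One plausible reconstruction of that flow (every path, mount point and host name here is invented), with the smbfs crossings marked:

    # old flow, roughly as the steps above describe -- all names hypothetical
    gzip /mnt/smb/feed/bigfile.dat                # smbfs: reads the uncompressed file
                                                  # AND writes the .gz back over the link
    cp /mnt/smb/feed/bigfile.dat.gz /var/tmp/     # smbfs: reads the compressed file again
    mv /var/tmp/bigfile.dat.gz /var/archive/      # local archive
    for h in fe1 fe2 fe3; do                      # front ends, fast LAN from here on
        scp /var/archive/bigfile.dat.gz "$h":/srv/app/
        ssh "$h" gunzip /srv/app/bigfile.dat.gz
    done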


The smbfs access is over an expensive (slow) link, while all the other servers talk over a cheap (1Gb Ethernet) LAN. Spot how the file gets copied via smbfs twice (in fact, twice uncompressed and twice compressed). Excellent. If you're going to go to this much effort to conserve network bandwidth, think it out first, ffs.
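
What it could have done instead, as a minimal sketch under the same invented names: pull the uncompressed file across the slow link exactly once, compress on the local side, and fan everything else out over the cheap LAN.

    #!/bin/sh
    # sketch of a fixed flow -- same hypothetical paths and hosts as above
    SRC=/mnt/smb/feed/bigfile.dat     # on the slow smbfs mount
    STAGED=/var/tmp/bigfile.dat.gz    # local staging copy
    ARCHIVE=/var/archive              # local archive directory

    # one pass over the slow link: read the uncompressed file once,
    # compressing locally so nothing is written back to smbfs
    gzip -c < "$SRC" > "$STAGED"

    # archive locally -- no network traffic at all
    cp "$STAGED" "$ARCHIVE/"

    # distribute over the 1Gb LAN, decompressing on each front end
    for h in fe1 fe2 fe3; do
        ssh "$h" 'gunzip -c > /srv/app/bigfile.dat' < "$STAGED"
    done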

And don't get me started on the excessive shilly-shallying it used to do, copying from one machine to another, and then having that machine copy it to a third to get installed; none of that network tomfoolery had been required for the past year or so.