Monday, November 5, 2007

Using ls to look at files by the date they were added to your system

The ls -l command displays fairly detailed information about the files in the directory you are in. It displays the date the file was last modified -- which is known in Unix as the mtime of the file. However, there are other date/time attributes on a file that might be more useful. The mtime doesn't necessarily tell you when the file last changed on your system -- if you've deployed an install package that preserves timestamps, it will probably show you the last time the files were modified on the system on which the package was built. This is sometimes useful for telling how old a version you have, but not really useful for much else.

The ctime is the change time, which is not the same as the file's modification time (mtime). The ctime is updated whenever the file's status changes -- its contents, permissions, ownership, and so on. So, oftentimes this will be the date the file was added to your system. Very useful indeed!

This command will display the file's ctime in the date spot instead of the file's mtime:

ls -lc filename

This will show you the directory sorted by the ctime:

ls -lct

This will show you the directory sorted by the ctime in reverse order:

ls -lctr
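
To see the difference for yourself, here's a quick sketch (demo.txt is just a made-up name). Backdating a file's mtime with touch is itself a status change, so the ctime records when you did it:

touch -t 200711010000 demo.txt   # set the mtime to Nov 1, 2007
ls -l demo.txt                   # date column shows Nov 1 (the mtime)
ls -lc demo.txt                  # date column shows right now (the ctime)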

Friday, June 29, 2007

Wrap long lines using fold

To line wrap a text file at 80 columns, and only break at spaces, use the command:

cat filename.txt | fold -w 80 -s

fold can also be told to wrap at bytes (-b) instead of columns, but I've never been quite sure how that would be useful.
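
For example, here's a little sketch (assuming you have seq) that builds one very long line and then wraps it:

printf '%s ' $(seq 1 200) > longline.txt   # one long line of numbers
fold -w 80 -s longline.txt                 # wrapped at 80 columns, broken at spaces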

Saturday, June 23, 2007

Comparing two files using comm

comm compares the contents of two files. It has 3 columns available in its output -- the lines only in file 1, the lines only in file 2, and the lines in both. You'll need to sort both files first, since comm expects its input in sorted order.

sort test1.txt > test1-sorted.txt
sort test2.txt > test2-sorted.txt

This will show you lines only in test1.txt:

comm -23 test1-sorted.txt test2-sorted.txt

This will show you lines only in test2.txt:

comm -13 test1-sorted.txt test2-sorted.txt

This will show you lines only common to both files:

comm -12 test1-sorted.txt test2-sorted.txt
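
Here's a tiny worked example with made-up file names and contents, just to show where the columns land:

printf 'apple\nbanana\ncherry\n' > fruit1.txt
printf 'banana\ncherry\ndate\n' > fruit2.txt
comm fruit1.txt fruit2.txt
# apple lands in column 1 (only in fruit1.txt, flush left)
# date lands in column 2 (only in fruit2.txt, indented one tab)
# banana and cherry land in column 3 (in both files, indented two tabs)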

We can also do some neat tricks with uniq/sort -u:

cat test1.txt | sort > test1-sorted.txt
cat test1.txt | sort -u > test1-sorted-u.txt

This will show you the lines that exist only in test1-sorted.txt -- in other words, the extra copies of any line that appears more than once in your original test1.txt file (sort -u kept just one of each):

comm -23 test1-sorted.txt test1-sorted-u.txt

Neat, huh?
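
Since uniq got a mention above but never actually appeared: once the file is sorted, uniq -d gets at the same idea directly, printing one copy of each line that appears more than once:

uniq -d test1-sorted.txt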

Setting a lot of user passwords at once

Say you have a whole class of users who have lost their passwords. Or perhaps you've just created a set of users with some automated method that doesn't allow you to easily set their passwords (such as a scriptfile full of useradd commands).

You'll want to use a file containing the following information:

user1:passwd1
user2:passwd2

Then run the command:

cat passwords.txt | chpasswd

Don't forget to delete your passwords.txt file just after you're done. Seriously bad idea to leave this file hanging around.

If you're looking for a way to automatically generate a whole bunch of passwords that your users won't balk at, the pwgen command may be installed on your system already!

If not, and you don't want to go hunting for one, try this: http://www.multicians.org/thvv/gpw.html
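
Putting the two together (chpasswd plus pwgen), here's a rough sketch -- users.txt is a made-up file with one username per line, and the pwgen options are just one reasonable choice:

#!/bin/sh
# build the user:password list that chpasswd expects,
# using a random 12-character password for each user
while read user
do
    echo "$user:$(pwgen -s 12 1)"
done < users.txt > passwords.txt

chpasswd < passwords.txt
rm passwords.txt    # hand the passwords out first, then get rid of this file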

Deleting a large file that’s causing problems

When you have a process that's going haywire and filling up a filesystem, remember that deleting the file isn't enough -- the disk space won't actually be freed until the process that's accessing/writing the file is killed (or closes the file).

If you're not sure what process that is, use the fuser command:

fuser filename

Sure, you can use options in the fuser command to automatically kill off all those processes, but that would make me really nervous if I wasn't absolutely sure what each of them was... killing one of those off could be worse than having a full filesystem.
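
A minimal sketch of the careful version, with a made-up filename and PID:

fuser -v /var/log/runaway.log   # on Linux, -v also lists the owner and command for each PID
# once you're sure which process it is:
kill 12345                      # the PID that fuser reported
rm /var/log/runaway.log         # now the space will really be freed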

Tuesday, April 10, 2007

csplit — splitting a text file into multiple separate files

Sometimes it's hard to get csv files imported into Excel. Especially if you have a long data file that needs to be separated out into different Excel sheets.

If you have data that needs to be in separate documents, and the different areas in your data file are delimited from each other in some fashion, then you can use csplit to split them into multiple files. csplit stands for context split -- there's also a plain split command, but that just splits a file by line or byte count, which is not what we want.

For example, if you had a file called input.txt that contains csv formatted spreadsheets, all beginning with the word Sheet by itself on a line, you can run the following command:

csplit -f sheet input.txt /Sheet/ \{99\}

This will split the file at each line matching Sheet -- the {99} tells csplit to repeat the pattern 99 more times -- and name the pieces sheet00, sheet01, sheet02, etc. If your file has more sheets than that, you can always run csplit again on the last piece, which holds the remainder of your data after the earlier sheets were taken out.

It will fail if you ask for more splits than there are matching Sheet lines in the file. So do a grep -c Sheet input.txt beforehand, subtract 1, and use that number in place of the 99. I played around with all sorts of backticks and things to try to get a version of the command that would do this for you, but it really ain't worth it.
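
To make the counting rule concrete, here's a throwaway example with three sheets in it (grep counts 3, so the repeat is 2):

printf 'Sheet\na,b\nSheet\nc,d\nSheet\ne,f\n' > demo.csv
grep -c Sheet demo.csv                   # prints 3
csplit -f sheet demo.csv /Sheet/ '{2}'
# sheet00 holds whatever came before the first Sheet line (nothing, here);
# sheet01, sheet02, and sheet03 each hold one sheet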

Sure, given a bit of time you could probably whip up a perl script that would do this better and faster, and probably name the files appropriately. And remove the "Sheet" line while it's at it. But this is already there!

As an aside, I always put a space before any numeric codes in csv files that shouldn't be treated like numbers. Especially if they have dashes in them. The space keeps Excel from trying to strip leading zeroes or convert them into dates.

grep!

grep has a -q (quiet) mode.

This is GREAT for use in scripts. If the string you're grepping for gets a match, grep exits with status 0 (and non-zero if it doesn't), and -q suppresses the usual output.

No fiddling with comparison operators for your if statements!

#!/bin/sh

if grep -q needle haystack.txt
then
    echo "I found a needle!"
else
    echo "No needles here..."
fi
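
It works just as well inline, since && and || key off the same exit status:

grep -q needle haystack.txt && echo "I found a needle!"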

Saturday, March 31, 2007

So, it's not a Unix trick, but it's very useful.

We've got a bunch of users that we're converting over to Windows domain users. It's tough to move the profiles manually, as there are all manner of shortcuts, Outlook stuff, etc. that point to the files where they currently live.

We found moveuser.exe (in the Windows Resource Kit). But it doesn't work with Administrator accounts! Oh noez!!! Some of our users are logging in as Administrator!

The way around it is thusly:

  1. Rename the Administrator account (say, to barbarella)
  2. Create a new administrator account
  3. Make the old administrator account that the user is using a Restricted User
  4. Put machine on the domain
  5. Reboot! Not only does the machine need this to get onto the domain, it needs it to see that the old administrator account is no longer an administrator account.
  6. Run 'moveuser.exe barbarella DOMAIN\barbarella'