Friday, November 23, 2007
New Encyclical
Rumor has it that Benedict's second encyclical, Spe Salvi ("Saved in Hope"), now has a definite release date: next Friday, November 30.
Read It And Think
Posted recently on InsideCatholic.com:
Really this should speak for itself. It's a metaphor, people.
Tuesday, November 13, 2007
Perl of Wisdom
This is for the spiritual geek in all of us: a copy of the Lord's Prayer that compiles in Perl.
#perl of_wisdom
open(Matthew6,$0); {print <Matthew6>;}
our %father; $who = "in heaven";
bless \$thy[name];
for ($earth .. $heaven) {
$thy[kingdom] + $come;
$thy[will] = $done;
}
sub mit { our $bread = localtime(today);
foreach $sin (@us) {
undef my $sin
if my $forgive;
return if $temptation or $evil; }}
for ( $thine = $kingdom and $power and $glory) {
for ($ever and $ever) {}; }
Monday, November 12, 2007
MySQL Binary upload
All the best evidence says: don't put binary objects in a MySQL database because it is horribly inefficient. You will inevitably be labeled a "noob" with much eye rolling if you even ask this question.
The plain fact is, you may want to be able to do this anyway, even if in the grand scheme of things this is a Bad Idea(tm).
So stop telling me NOT TO, and start telling me HOW.
The basic problem is this: the single best and fastest way to get ANYTHING into a MySQL database is to use the LOAD DATA LOCAL INFILE method, which does a mass import into the selected database. This is even better if it is multiple records in a delimited format. The problem with this approach is that it simply inserts the data into the next auto-incremented slot. You cannot control WHICH record it will insert into.
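For reference, here is a minimal sketch of what the LOAD DATA approach looks like through Perl's DBI. The table and column names are hypothetical, and the connection details are placeholders:

```perl
# Hedged sketch: bulk-load a tab-delimited file with LOAD DATA LOCAL INFILE.
# Table 'software_repos' and its columns are assumptions for illustration.
use strict;
use warnings;
use DBI;

# mysql_local_infile=1 enables LOCAL INFILE on the client side
my $dbh = DBI->connect(
    "DBI:mysql:database=test;mysql_local_infile=1",
    "user", "password", { RaiseError => 1 });

# Mass import: very fast, but each row lands in the next
# auto-increment slot -- you get no say over WHICH record it becomes.
$dbh->do(qq{
    LOAD DATA LOCAL INFILE 'records.txt'
    INTO TABLE software_repos
    FIELDS TERMINATED BY '\\t'
    (name, vers, description)
});
```

This is the right tool for delimited text, which is exactly why it falls down for a single binary object destined for a specific row.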
The other proposed method is to read the file into a $variable and do a simple INSERT INTO table VALUES ($variable). Using this method you can control WHERE the data gets inserted. The fatal flaw is that $variable is interpolated by your scripting language BEFORE it is inserted, which usually ends up breaking the INSERT syntax due to quotes, backslashes, interpreted characters, and sheer length. Bound placeholders (as in the DBI code below) cure the quoting problem, but not the length problem: a single value larger than max_allowed_packet will still fail.
So how to fix? The simple solution is to perform a READ of MAXLENGTH BYTES inside a loop and sequentially update the record(s) of choice.
Pseudocode:
WHILE still BYTES_IN_FILE {
    READ (FILEHANDLE, BYTES, MAXLENGTH)
    UPDATE ROW with VALUES (val1, val2, BYTES)
}
This avoids MAXLENGTH problems in both the MySQL client and server. What you end up with is chunks of your binary in multiple records that you control. To reconstitute the file, you add code to your script to track the chunks and download them in order.
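The download side can be sketched like this. It is a hedged example, not the original author's code: the id and bin column names, and the idea of ordering chunks by an auto-increment id, are assumptions:

```perl
# Hedged sketch: reconstitute a file from its BLOB chunks.
# Column names (id, bin, name, vers) are assumptions for illustration.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("DBI:mysql:database=test", "user", "password",
                       { RaiseError => 1 });

my ($sname, $vers, $fname) = ("myapp", "1.0", "myapp.bin");

# Chunks were INSERTed in order, so an auto-increment id
# gives them back in the same order.
my $sth = $dbh->prepare(qq{
    SELECT bin FROM software_repos
    WHERE name = ? AND vers = ?
    ORDER BY id});
$sth->execute($sname, $vers);

open my $out, '>', $fname or die "can't write $fname: $!\n";
binmode $out;                       # binary-safe output
while (my ($chunk) = $sth->fetchrow_array) {
    print $out $chunk;              # append each chunk in sequence
}
close $out;
```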
Thanks to gmax at perlmonks.org for this great code snippet, called:
Handling huge BLOB fields with DBI and MySQL
open FILE, "< $fname" or die "can't open $fname: $!\n";
binmode FILE;  # essential for binary data, especially on Windows
my $maxlen = getmaxlen();  # gets the value of max_allowed_packet
my $bytes = $maxlen;
$fname =~ s{.*/}{};  # removes the path from the file name
my $sth = $dbh->prepare(qq{
    INSERT INTO software_repos
    (name, vers, bin, description, filename, username, updated)
    VALUES ( ?, ?, ?, ?, ?, user(), NULL)});
# now we read the file and upload it piece by piece
while ($bytes) {
    read FILE, $bytes, $maxlen;    # read the next chunk into $bytes
    $sth->execute($sname, $vers, $bytes, $descr, $fname)
        if $bytes;                 # skip the final empty read
}
close FILE;
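The getmaxlen() helper isn't shown in the snippet. A plausible sketch, querying the server's max_allowed_packet and leaving headroom for the non-BLOB parts of the INSERT (the 100KB margin is a guess, not a documented figure):

```perl
# Hedged sketch of getmaxlen(): asks the server for max_allowed_packet.
# Assumes $dbh is the connected DBI handle used elsewhere in the script.
sub getmaxlen {
    my ($name, $value) = $dbh->selectrow_array(
        qq{SHOW VARIABLES LIKE 'max_allowed_packet'});
    return $value - 100_000;  # leave room for the rest of the statement
}
```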