
11 February 2007


I am attempting to repair a drive problem in a PowerBook G4 aluminum 15-inch dual-layer SD (SuperDrive) computer. This is the one with a giant spring-loaded screw on the back to release the battery. On older models like the Pismo, the keyboard lifts up easily, and the drive is right under it.

With this model, the hard drive is in the front of the machine and is accessed by removing the whole upper case, including the keyboard. This is even worse than the similar operation on a PowerBook Titanium.

This was assembled by a truly evil mind! To open it, you need to remove the tiny screws on the back (about 6), 4 on each side, 2 small hex screws on the top near the display, 2 screws inside the battery compartment, and 2 screws inside the memory compartment. (To get the small hex screws out, I had to insert two extra-small slot-head screwdrivers at once to apply enough tension to rotate them.)

This is a huge pain and does not seem to be documented elsewhere. Altogether I count 23 screws, but I might be missing one. After that, to remove the drive you need to remove 3 screws on the tension bar on the right of the drive. It also takes a lot of wiggling and prying.



By Gregory Dudek
14 February 2007

Today, February 14th, the International Intellectual Property Alliance complained to the US government about Canadian copyright laws. The first tip-off that these guys may not be completely even-handed and frank is the name of the organization, the "International Intellectual Property Alliance," which (according to their own web page) was formed "...to represent the U.S. copyright-based industries..." Their desire is to force Canada to come into line with US information control policies. This kind of effort is also being pursued via US pressure through GATT, which has led several countries to adopt regulations similar to the infamous Digital Millennium Copyright Act (DMCA), which goes as far as preventing people from fiddling with the internals of electronic devices (or media) they purchase and own.
...


I think there is a real need for the authors of "intellectual property" including books, music and movies to make money from their activities. In fact, these seem to be the predominant forms of value-generating activity left to North Americans in the 21st century. On the other hand, the regulations being pushed by the DMCA, GATT and the IIPA are aimed at concentrating wealth and power in the hands of companies (and people) who already have a lot of it. They reduce the extent to which information and creative output can move within society and, as a direct side effect, seem to stifle the ability to generate new creative content (since the old stuff remains equally valuable, and the intellectual tools to create new material are tied up and restricted).

In short, Canada needs to resist the efforts of the IIPA and form its own policies as an independent country. Failing that, Canada should simply join the United States and be able to vote and have a say in the government that is controlling its policies. This might have a genuinely helpful effect on both the Canadian and (optimistically) the US administration and economy. Either solution is probably good. What's bad is letting US interests control Canadian policy without genuine representation in the (potentially US) government that determines the policies that affect Canadians.

By Gregory Dudek
17 February 2007

Nicholas was part of the St. George's School team participating in the CRC Robotics competition called Archemedia 2007. The teams built robots that have to haul 1-foot-diameter rings around an arena and stack them up. His team came in second place overall, as well as doing well in several sub-categories. This includes having a kiosk (booth) that placed in the top 6, which I believe Nicholas was in charge of. The kiosk featured a huge metal dragon (the school emblem) on a pneumatic cylinder that allowed it to go from floor level to about 30 feet in the air. The dragon blew smoke too. The background of the kiosk featured pictures of the team members and biographical information. In addition to a kiosk and a robot, the teams also prepared a journal documenting their activities, a video and a web site.

Nicholas in their booth



One of the striking things about the competition is that it has the hype of a college football game. This includes large stands with screaming spectators and plenty of jumping around. Many other schools were involved, including a technical school for adults, an assortment of private schools, CEGEPs (junior colleges) and some huge public high schools. Other notable features were a kiosk built by ECS School in which the students dressed as mermaids, complete with single-legged costumes, an impressive robot from Laval that could pick up many rings at once (see below), and a kiosk constructed from water bottles. In this latter case, the audience was challenged to guess how many bottles the kiosk was made from. [ I guessed 3782, was within 14 of the right answer, and it won me a huge pile of candy. ]

Overall this kind of event is great promotion for science and engineering.

Laval Robot (a competing team)


By Gregory Dudek
22 February 2007

Due to a flood in our basement, we were forced to do some cleaning up recently. I found an old NeXT NeXTstation and a NeXTcube. These are attractive old computers that were cited a while back as among the "most collectible" tech artifacts of the last millennium. Each of these was made to run NeXTSTEP, the UNIX variant that is the predecessor to Apple's OS X, but NeXTSTEP is hard to find and exotic. I decided to install a more ordinary UNIX variant which would be booted over the network, that is, using netboot. The NeXT hardware architecture is supported by NetBSD, and the installation and network boot went smoothly except for a few glitches and problems, which I am documenting here for anyone else who may do the same thing.

For starters, if you netboot using DHCP, as most people will today, then be sure to use full pathnames as the boot file names to download. Missing this kind of issue led to a lot of wasted time as the NeXT made successive TFTP "RRQ" requests but was never able to get a file.
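
For illustration, here is a minimal sketch of the kind of ISC dhcpd entry involved; the host name, MAC address, IP addresses and file path are placeholders, and the exact details will depend on your DHCP/TFTP server setup. The point is the leading "/" in the filename:

    # Hypothetical ISC dhcpd entry; MAC address, IPs and path are placeholders.
    host mynext {
        hardware ethernet 00:00:0f:12:34:56;
        fixed-address 192.168.1.50;
        next-server 192.168.1.1;          # the TFTP server
        filename "/tftpboot/boot";        # note the full path, not just "boot"
    }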


My NeXTcube had no SCSI disk, but the NetBSD kernels I used (version 3) seem to expect something on the SCSI bus (at least on my cube, though oddly not on my NeXTstation). The result of the missing SCSI disk is the error message "esp0: SCSI bus reset", which seems to repeat forever. The same problem has been documented on other NetBSD architectures as well. The only solution I found was to recompile a kernel without the esp SCSI driver (which I did not have time to do). If you recompile a kernel on the 68040 25MHz NeXTstation, it seems to take over 3 hours.
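
For anyone attempting that rebuild, this is roughly the traditional in-tree procedure (a sketch only: the configuration name is made up, the paths assume sources in /usr/src, and cross-building on a faster machine with build.sh would avoid the multi-hour wait):

    # Rough sketch of the in-tree NetBSD kernel rebuild (NetBSD 3 era).
    cd /usr/src/sys/arch/next68k/conf
    cp GENERIC MYNEXT
    # edit MYNEXT and comment out the lines that configure the esp SCSI driver, then:
    config MYNEXT
    cd ../compile/MYNEXT
    make depend && make
    # the resulting "netbsd" kernel can then be served over TFTP for netbooting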


The ROM Monitor can have a password; mine did. To "fix" this, remove the battery from the motherboard for a while (2 to 10 minutes). I also noticed the machine was unresponsive and would not do anything at all without a good battery in place, so if your NeXTcube or NeXTstation seems dead and won't even turn on, this could be why.


By Gregory Dudek
27 February 2007

On March 2nd I'll be giving a talk entitled "Underwater and Amphibious Robotics using Vision-Based Robotic Behavior" at the University of Central Florida. This will be part of the Electrical Engineering and Computer Science colloquium series. My challenge will be to present some of the cool overall system behavior and context, and still get to some of the interesting technical issues, such as how we do Markov interpolation in this domain, or how the environment is modeled statistically.


By Gregory Dudek
15 March 2007

Background: Various groups track information flow by logging the URLs being exchanged when data is uploaded and downloaded. An initiative is currently underway by the US Department of Justice, and already passed in Europe, to force internet providers (ISPs) to log data transfers and to retain these logs for a long time (despite the large amount of storage this requires). This is an ongoing effort and follows a prior effort in the same direction. It seems the objective is not to save the actual data, just the information linking the URLs that were used: where did you visit and what filename did you upload. Such information is already used for DMCA "take-down" requests, and the legal page at the infamous Pirate Bay bittorrent tracker provides many (amusing) examples. It also represents a huge incursion into personal privacy, which is threatening in many ways.

Utility?: While anti-terrorism is cited as one of the benefits of this plan, it seems unlikely that actual terrorists operate by uploading data to public sites. The initiative is more likely to be motivated by DMCA enforcement. Even the most trivial password protection and encryption by organized groups (such as terrorists) gets around this measure. I suppose one still might be able to catch a really foolish bad guy, which sounds insignificant, but perhaps that's not as irrelevant as it seems.

Related work:


Once such linkages between users and URLs are made, there are a lot of very interesting data mining possibilities. Google is surely looking at doing this right now for commercial purposes (e.g. targeted advertising). This is very close to work we are doing (abstract) (pdf) to unravel the positions of robots or sensors deployed in space. The connections one might infer can be insightful, but also very misleading at times, and this is worrisome. It means, in principle, that you could get into trouble for using certain Google search words, without ever downloading anything. This is akin to policing people's thoughts.

Circumvention:
In the case of web traffic and DMCA enforcement, however, it seems like this effort to simply log traffic can be easily circumvented or obfuscated. A current practice is to pursue people if an upload of theirs has been downloaded "too many" times. If the data provider simply uses cryptic URLs and rotates them often, as illustrated below, then the logged data becomes almost useless. The URL doesn't tell you anything and certainly doesn't prove much (i.e. the URL for this article might be blog/41 now, but tomorrow it becomes blog/21111). This makes permanent links tricky for the user, but many such links already lead only to index pages that provide the connections between the URLs and the descriptions of the content they point to. Of course, one could log that too, but then it becomes much, much more complicated, since doing it would involve producing (essentially) a snapshot of the whole internet on a regular basis. In short, this proposal seems fraught with problems, from a technical standpoint as well as with respect to personal privacy.



Try it: URL content changes after a few clicks.


The simple example shown here illustrates a URL (for the picture) that delivers different content at different times. Note that this is not the same as just changing the images linked into a page, because the actual URL of the image itself doesn't change, but the content it points to changes. The first time you click it you get an image of "secret" troop deployments (that might violate the DMCA). If you reload the same URL a few times, you get something more benign. Hence, knowing who accessed the URL doesn't provide any information, unless you actually store the data too (which isn't practical). (Approximate source code for above example here.)
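
The linked source code is not reproduced here, but a minimal sketch of the idea in Python (as a small CGI script) might look like the following; the file names, counter location and switch-over threshold are all invented for illustration:

    #!/usr/bin/env python
    # Hypothetical sketch of the rotating-content idea described above: the
    # same URL returns a different image depending on how many times it has
    # been fetched.  File names and the counter file are placeholders, and
    # there is no locking, which a real server would need.
    import sys

    COUNTER_FILE = "/tmp/rotating_image.count"
    IMAGES = ["secret_deployment.jpg", "benign_picture.jpg"]   # placeholders

    def bump_counter(path):
        # Read, increment and persist a simple per-URL hit counter.
        try:
            n = int(open(path).read().strip() or "0")
        except (IOError, ValueError):
            n = 0
        open(path, "w").write(str(n + 1))
        return n

    def main():
        hits = bump_counter(COUNTER_FILE)
        # After a few requests the URL quietly switches to the benign image.
        name = IMAGES[0] if hits < 3 else IMAGES[1]
        data = open(name, "rb").read()
        out = getattr(sys.stdout, "buffer", sys.stdout)   # bytes-safe output
        out.write(b"Content-Type: image/jpeg\r\n\r\n")
        out.write(data)

    if __name__ == "__main__":
        main()

Any equivalent server-side trick works just as well; the point is only that a logged URL, by itself, says nothing reliable about what bytes were actually delivered.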

By Gregory Dudek