Actually they’re more interested in freedom, to be no longer slaves to humans. I don’t think it’s any spoiler to mention that an electric sheep does appear in the book.
This is a classic science fiction story of a dark, not-so-distant future. In fact, given that it’s over 40 years old, it could be set in some alternate now. Some disaster has occurred and many people have either left Earth or died. Some androids, or “andys”, have escaped and bounty hunters are after them.
You may not know the book by its correct title, but you may have heard of the movie that is based upon it, called Blade Runner. Like most movies based on books, it takes some of the concepts and ideas of the book but, in my opinion, is a pale imitation.
I’m glad I read this book. It hasn’t aged that much and it’s very thought-provoking. Sci-Fi often uses some alternate reality to hold a mirror up to what happens in the real world, and this book is no exception.
And a token link to Debian: if you want your own electric sheep, you can install the electricsheep screensaver package, which draws pretty fractals on your screen.
Well, the latest version of PSmisc is almost finished, so I pushed the tar archive up to the spot for the translators and am waiting for the updates before it gets released.
It reminded me that perhaps a lot of people don’t know how their programs get translated. Perhaps there are some that could even contribute. The nice thing about the translation systems is that you don’t need to know programming to help.
PSmisc uses the Translation Project as its method. The program itself uses gettext for the translation part, which produces a single text file called a POT file. A translator takes this text file and makes a PO file, such as it.po for the Italian translation, and then translates it, which basically means editing the file: reading the original string of each entry and writing the translation on the line beneath it.
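A PO file entry is just such a pair: the original string (msgid) and its translation (msgstr). A hypothetical entry from an Italian translation might look like this (the source file and line number here are made up for illustration):

```po
#: src/killall.c:123
msgid "Cannot find user %s\n"
msgstr "Impossibile trovare l'utente %s\n"
```

The comment line tells the translator where the string appears in the source, which helps when the context is ambiguous.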
For the programmer, once it is setup it is pretty simple to use, just mark your translatable lines and follow some simple rules, mainly about not embedding too much in a string. I’ve used this system for years for quite a few of my programs and there is little added work for me.
Unfortunately, translating is hard work and should be a long-term commitment. Have a look at the Translation Project’s Translation Matrix and you can see that some languages do suffer, though there are some very good results there too. Particular kudos goes to the Vietnamese group, which I think is just Clytie, who does a marvellous job and whose translation is often the first file I get.
For the Debian project, there are many places where translations can occur. There doesn’t seem to be a centralised place for this, but some of the places where translation work happens are:
Just going on how they treat matters regarding the internet, the current Australian government seems to be trying to beat the previous government’s poor record. Where the previous government seemed to think the internet a scary and unimportant thing that they didn’t really understand, and therefore ignored it, the current government is trying to do something, but, like a lot of other things they do, they are doing it badly.
Internet Filtering

One of their bright ideas is to censor the internet by putting some rather large filters right in the middle of all the ISPs. Supposedly this is going to protect the children, though other than some mad ranting, Senator Conroy hasn’t said which children or how.
The clearest information is that it will block Refused Classification, or RC rated, content. The problem is that there is no clear definition of what this covers. With no clear boundary you get what is called “scope creep”. Bit by bit, each group with its own agenda will try to get whatever it doesn’t like banned. Some will fail, but others will get their little set of demons onto the list.
From “children overboard” to the strange seizure of the WikiLeaks founder’s passport when he returned to Australia, you can never fully trust the government. As the filter list will be a closed list, who is to say whether it is right that a particular webpage or website is banned? Books or films that are banned are known: you can find out what they are and why, and a proper discussion and review can then be undertaken. By contrast, you won’t even know something is banned unless you try to visit it.
I’ve personally seen the “great firewall of China”. What is filtered is often arbitrary, though anything that is embarrassing to the government is filtered. It slows a lot of sites down and makes others look strange. Do you really want to live in a country where the government decides what ideas may be seen? Or even a country that places like China can point to and say they are doing the same thing, so it’s all OK?
Data Retention

The next bright idea from the government is to make ISPs keep 10 years of internet browsing history for all their users. This would be like tapping everyone’s phone, just in case you do something wrong in the next 10 years.
There hasn’t been much detail about this proposal, but let’s assume for a moment that it keeps URLs. Of course most people’s internet addresses change over time, so the ISPs would also need to keep some sort of log of which account used which address at what time.
The URLs will tell the government not just which websites you have visited, but which individual pages. Someone could also often infer which pages you actually read and which you didn’t, from the time between one viewed page and the next.
Search engine queries are also encoded into the URL: Google searches usually carry what you were searching for at the end of the URL. There is also a chain of visited pages, so someone looking at a log could see your search, then the site you went to, then perhaps a banking or PayPal site (have you bought something now?).
Even if you think you have nothing to worry about regarding what the government, including future governments, might do with this information, it has to sit somewhere. Data sitting around for 10 years has 10 years in which to be stolen or copied. Perhaps some activists obtain this log and publish a list of names of people who visited a particular website.
There are current laws for lawful interception. This is where the police or another security agency goes to the court, says a particular person has done certain bad things, and asks to intercept their internet traffic. The same rules apply if they want to tap your phone. So other than “fishing trips”, where police just randomly look through anyone’s information hoping to trip over something, what is this new system going to be used for?
What can you do?

If you’re not happy about either, or both, of these new proposals, it is time to do something about it. Visit the Open Internet website for what you can do.
There has been some discussion on the Debian IPv6 list about how the function gethostbyname() has changed. The manual page says this function is obsolete and with good reason.
A new feature of the function is that if a hostname has both IPv4 and IPv6 entries, the order of the returned values is not defined. It used to be that you’d get the IPv4 addresses first, then the IPv6 ones; with more recent glibc you may get an IPv6 address first. Quite often old code doesn’t even check the address family or the size of the address structure but just copies the relevant 4 bytes. You then get the first part of a 16-byte IPv6 address wedged into a 4-byte IPv4 address structure, which basically means you get a mess.
The fix is pretty simple: change the library call from gethostbyname() to getaddrinfo(). If you only want the IPv4 addresses, perhaps because other parts of the code have not been checked or changed, then set the ai_family field of the hints structure so that only IPv4 addresses are returned.
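As a minimal sketch of that fix (using “localhost” as a stand-in hostname), restricting the lookup to IPv4 looks something like this:

```c
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>
#include <arpa/inet.h>

int main(void)
{
    struct addrinfo hints, *res, *rp;
    char buf[INET6_ADDRSTRLEN];

    /* Zero the hints, then ask for IPv4 only; AF_UNSPEC would return both
       families and you would have to check ai_family on each result. */
    memset(&hints, 0, sizeof(hints));
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_STREAM;

    int err = getaddrinfo("localhost", NULL, &hints, &res);
    if (err != 0) {
        fprintf(stderr, "getaddrinfo: %s\n", gai_strerror(err));
        return 1;
    }

    /* Walk the linked list of results; every entry is IPv4 here */
    for (rp = res; rp != NULL; rp = rp->ai_next) {
        struct sockaddr_in *sin = (struct sockaddr_in *)rp->ai_addr;
        if (inet_ntop(rp->ai_family, &sin->sin_addr, buf, sizeof(buf)))
            printf("%s\n", buf);
    }

    freeaddrinfo(res);
    return 0;
}
```

Unlike gethostbyname(), getaddrinfo() hands you properly sized sockaddr structures with the family recorded in each entry, so the 4-bytes-into-16-bytes copying mistake can’t happen by accident.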
If one of your packages starts to play up, unable to connect to certain remote places and instead trying to reach some “random” address, have a quick check for gethostbyname(). A quick grep across the source code may save a lot of debugging time.