The still-relevant bits of this blog - an overview, 2003-2018
I've used this blog to jot down thoughts and ideas, which I sometimes want to go over. But by this point it contains so much content that it's hard for me to dive into it. This post is an attempt to provide a quick overview of its content: one that filters out posts no longer relevant to me, organises the material into logical groupings, and summarises the basic point of each post.
On the topic of my thoughts and ideas, there's also what I've written on Hacker News, Reddit, Twitter, and Quora.
Basic Research
What is the value of basic research? Not individual pieces of basic research, but that entire category of research? My thoughts on how we can think about this question, and why basic research's value is significantly underestimated.
Scientific and Technological Development
It is useful to see scientific and technological development as removing constraints on our ability to achieve goals. This demystifies the process, and suggests possible future developments.
Information Ecosystem
The information ecosystem is all the means, across all of society, of storing, presenting, communicating, and using information. In that, how can we help truths to spread and hinder the spread of falsehoods? (one idea this post contains is a source outlining more applied kinds of mistaken reasoning and justification).
Thoughts on how we can use fact checking, and linking to supporting evidence, to help improve the information ecosystem. A goal being to make 'providing supporting evidence' more of a norm. And some thoughts on how we can make fact checking practical and as easy as possible.
An authoritative source of current scientific opinion would help. A source for reliable statistics for facts like "X% of working biologists believe in the theory of evolution". It'd be an important resource for informing the public and those making social policies.
Bret Taylor's suggestion for a Wikipedia for data (statistics, listings, etc), which still doesn't seem to have come to fruition.
A few minor things. Identifiers for news topics would help. If a topic (like a particular accident that occurred) could be given a unique identifier, and if all reporting and follow-ups on it (by all different parties) referenced that identifier. Some articles suggesting improvements to news sites. Some sites for structured representations of debates on topics (though it is questionable how much value these have): Debatepedia, spacedebate.org (which uses the Open Debate Engine, as do these two).
Related: Clay Shirky on the use of evidence in society (his response to the 2007 Edge question). He explains why "even today, evidence has barely begun to upend authority", and why, despite this, he's optimistic this is changing.
Interactive Storytelling
Can we use interactivity not for player agency, but as a means of increasing their immersion in a character's perspective? Think of something more like an interactive movie or documentary than a game. I sketch a high-level vision for this.
Some ideas for how we could use interactive storytelling to teach skills like communication skills.
Minor:
A miscellaneous idea: could you use a character like a Nintendog in a narrative-focused game?
(And not something I intend to pursue, but a thought on presenting stories through email correspondence. Including the idea of seeing the emails drafted in real-time.)
Though not necessarily something you'd use for interactive storytelling, here's a concept you could use in a computer game: graphically representing the perceptual attention of a character (such as the one the player is controlling). Not the area they're looking at, but the differing levels of attention they're giving to things within that area.
Cognitive
'Core world-model' as a Creole
Poor language/concept use
"Analog" properties, such as the level of mercury in a thermometer, are really those whose levels can change in increments smaller than we can perceive.
Treating the meaning of some words/phrases in terms of their meaning in the abstract can cause confusion. We need to think of what their meaning is in the particular context in question. Another example of this.
Popper's "What is Dialectic?" is an excellent example of the interaction between language and thought, and the problems that can thus be caused.
Species, Language and Reality: an example of how language misleads us. The notion of 'species' makes it seem like there's a lack of 'intermediate forms' in fossil records, when in fact there's no such issue. (Also related to the 'evolution' topic found below).
Opinions Aren't Always Just Opinions: arguing against the common view that opinions are unquestionable.
Phrasing and significance: some simple examples of where language is used to make something sound significant.
Thinking
I had a lot of difficulty summarising the following post, and I ended up mostly re-writing it in my attempt. The original also provides a potential explanation of why we tend to overlook the negative processes.
I think critical thinking is probably a lot more important than obtaining knowledge. Not that the latter is unimportant, but that a strong core of critical thinking is necessary, along with obtaining knowledge, for good results.
Knowledge and intelligence can be structures/processes that reflect how things are, so we increase our knowledge and intelligence by obtaining more of these things (through reading, practice, etc). That is, they can be 'positive' things. But they can also be 'negative'. Critical thinking: knowledge and intelligence that enables us to be critical, to not jump to conclusions, to not overgeneralise, to critically evaluate inferences. These negative elements are filters; they're for avoiding inaccurate knowledge/reasoning.
It's important to apply such filters. Many ideas come from intuition, and intuition doesn't work very well for many topics (e.g. science, technology and economics). We tend to get knowledge from what we hear/learn from others -- building new knowledge is very difficult and time-consuming -- and there are many inaccurate views out there (e.g. views that aren't accurate but are good at propagating amongst the population). There are many more ways to be wrong than to be right, and once a false view gets into your brain it can be difficult to dislodge.
Positive learning/practice can contribute to being critical, but it doesn't necessarily lead to that, and a critical attitude is somewhat of a parallel track that gets benefits aside from positive learning.
A purely-speculative notion is one without any supporting evidence. When a purely-speculative notion is invoked to explain some phenomenon, it's less obvious that it's purely-speculative. Think of pre-scientific explanations of the natural world, like the spheres many believed to be associated with epicycles. The fact that it's doing explanatory work makes it seem more real, and its being nested within an explanation makes it more difficult to take a critical eye to. Such purely-speculative notions may even become a transparent part of how we perceive/conceptualise the phenomenon, such that the fact of the phenomenon's existence and character seems to provide concrete evidence of the existence of the -- in fact, purely speculative -- notion.
In a subsequent post I clarified a couple of things. What I mean by evidence: if an explanation turns out to be false we can see there was never any actual evidence for it, just information that was suggestive to us, that we incorrectly took as evidence. Some explanations really are purely-speculative: they're just-so stories that only have going for them that if they were true then they would provide an explanation. And, that purely-speculative explanations aren't intrinsically bad. They are a necessary part of theorising and obtaining a true explanation. They're bad when the lack of associated evidence isn't recognised.
In a further post, I distinguished between a potential explanation (which demonstrates nothing more than a potential, and neither indicates that it is the explanation nor rules out other potential explanations), and an actual explanation (which demonstrates that this, out of all the possible explanations, is the actual explanation of the phenomenon). No amount of evidence for a potential explanation can demonstrate that something is the actual explanation. The inability to imagine an alternative explanation is often cited as evidence that a potential explanation must be the actual explanation, but it does not provide any evidence for this.
When we draw a conclusion about X (an item, a situation, etc) we should do so by looking at the properties of X that are relevant to whether that conclusion holds or not. It is, however, very easy to take lazy shortcuts -- to jump to conclusions -- and draw that conclusion based on generalisations that apply to X rather than actually looking at the particulars of X. This is an example of that.
Instead of judging whether something is plausible by trying to imagine whether it could exist (the flaw in this being a positivist attitude), we should try to find reasons why it couldn't exist -- and if we can't, be agnostic about whether it could exist or not. If you need to take some action that's dependent upon whether the thing exists or not, that action needs to go one way or the other, but even this doesn't require you to have a definitive belief.
Wikipedia's list of common misconceptions (direct link).
Things we don't understand always seem intangible.
The tendency to assume that things are as we perceive them. An example of this is assuming that since qualia seems inexplicable it is so.
We often wrongly attribute properties to objects that are actually the properties of interactions. For example, that something is addictive has to do with how our bodies/brains process it, not some intrinsic 'addictiveness' property of that thing. So non-physical things (like activities such as video games) can be just as addictive.
A link to a 10 minute video on what it means to be open-minded (direct link).
On intuition. Distinguishing between learned-intuition and heuristic-intuition. Reason and intuition both have their place, but reason is better for determining which of these is better to use in a particular situation.
In cool reflection, it's easy to underestimate the force of emotions. When you experience emotions they affect you; when you merely imagine such a situation they don't (or only to a much lesser degree). This makes it easy to underestimate the force of emotions, and overestimate the ease of just doing what you want.
Motivation is often overlooked as a kind of emotion, feeling or desire. I have a brief look at what factors can influence motivation and why it's not a simple matter of "wanting to do" the thing.
David Deutsch's reply to the 2001 Edge question, arguing for the importance of looking for explanations for what we see happening (direct link).
"What is it in reality that brings about the outcome that we predict?" Whenever we fail to take that question seriously enough, we are blinded to gaps in our favoured explanation.
The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function.
Some posts related to that. I think it's an important skill to be able to reason about an issue without the implicit assumption that your position on it is true, and I talk a bit about what is required for doing this. It's good to understand views that you reject. Why does it make sense to others? Doing this can help guard against rejecting things simply because they seem "ridiculous" to you.
On the common but mistaken view that 'how much general knowledge you have' equates to how smart you are.
We think of size in relative terms, relative to our own size. But this is very misleading. It's more accurate to think of something's size in terms of the amount of 'atomic' detail it consists of, and in such terms everyday objects like ourselves are very large.
Despite how perception presents the world to us, the actual physics (of "subatomic particles", so to speak) is everywhere and everything.
Just an interesting thing to think of: as well as the physical Earth, there is each person's personal conception of the world, the one they actually see and experience in their heads, and there are billions of these.
Non-experts can be better at solving problems than the experts - an example.
On the tendency to dismiss things
The natural world. Giant ocean waves were long dismissed as myths, despite the fact that they actually exist (as has now been definitively shown). Early reports of geysers weren't taken seriously.
Systemic Worldview
It's important to appreciate the kinds of 'systemic effects' that can make systems work in counterintuitive ways. Such as when incentives are poorly aligned. Some examples of this, to do with why enterprise software is so often poor. Another example - Bruce Schneier's notion of CYA (Cover Your Ass) security: "much of our country's counterterrorism security spending is not designed to protect us from the terrorists, but instead to protect our public officials from criticism when another attack occurs".
An example of systemic effects. In the article "Internet Killed the Alien Star", Douglas Kern contends that the UFO movement has been largely killed off because of the internet and other technological developments, such as camera phones.
Change is messy, and this is under-appreciated in practice. In 2009 Clay Shirky wrote "Newspapers and Thinking the Unthinkable". Newspapers are going away, it'll take some time to invent alternative venues for journalism, and the whole process will be messy (and the history of the printing press can help us understand this).
Chris Anderson's article 'The Probabilistic Age': "Our brains aren't wired to think in terms of statistics and probability." (direct link)
A broader notion of affordances
Donald Norman's notion of affordances: the character of physical objects suggests how we can interact with them. I think there's value in a broader notion of affordances: the character of entities (whether physical objects, or things like mental concepts) shapes how we perceive, think about, or do things.
It's very easy to underestimate the effect of convenience and reminders in determining whether a person will actually undertake some activity, so we tend to underestimate the importance of affordances.
An example of software affordances: when there's a language for creating some data (e.g. a templating language for generating HTML pages), the degree to which the form of the code matches the form of the data to be generated is an affordance.
Some more examples of affordances, and the utility of having a very general notion of affordances.
Evolution
Our usual language for talking about evolution tends to be misleading, so I suggest some alternatives. And, prompted by a comment on that post, some elaboration.
Most people only understand the definition of evolution, which by itself provides a misleading picture of it. To be able to think sensibly about evolution requires understanding a bit about what evolutionary processes are like in practice.
Some miscellaneous things relating to evolution:
The Copenhagen Interpretation of quantum physics and the theory of evolution seem fundamentally at odds with each other, and the latter has much better evidence in favour of it.
Could our 'inner voice' have evolved as a way to re-use existing machinery (for processing external voices) for a general-purpose intra-brain communication mechanism?
Reductionism
'Reductionism' is very often misunderstood. I try to clarify what it really means.
The whole is greater than the imagined coming together of the parts: usually when it's claimed that "the whole is greater than the sum of the parts" in some situation, that claim isn't based on a precise, concrete understanding of what the "sum of the parts" really is, but more just on an intuitive impression of what the "sum of the parts" would be. This is another flawed reason why people think reductionism is wrong.
Laddered skills
It's difficult to imagine higher rungs above our skill level, which intuitively makes it feel like they don't exist. It's difficult to imagine how we'd reach a higher rung, which makes it feel like there isn't a point in trying. And it's difficult to imagine what having a high-rung ability would enable, which makes it feel like there wouldn't be a point anyway. But these are just limitations in our ability to imagine, which we should expect to experience and which we should ignore.
Acquiring knowledge is acquiring a skill. As we learn a skill we refine our abilities, and develop fluency. I suggest that acquiring knowledge is literally a matter of acquiring skills, and follows this same pattern.
Being able to explicitly reason about a concept is a skill. One rung on the ladder of cognitive skills is being able to explicitly reason about concepts. Doing so requires being able to override intuitions, and a fair bit of practice. It's an important skill for being able to draw reasonable conclusions about a topic, which also means that non-fiction books arguing for a particular conclusion should pay heed to this and try to help the reader develop skills in reasoning about the concepts involved.
Computing
Minor:
Simulations, for teaching learner drivers, could present them with the sorts of unusual circumstances that can cause accidents (e.g. another car suddenly behaving erratically)
Is there a way to incentivise and support summaries of threads on discussion sites like Reddit?
Would it be useful to have a "Stack Overflow" for finding research material in a subject area? It could help people from outside a field (including those trying to get into the field) to find out about relevant research in the field, from people in that field. I discuss the issue of incentives for contributing to such a resource. Perhaps this need has been, since writing this post, served by Quora or some subreddits. I'm not sure.
Immersive language teaching. 2004 article on software developed by the University of Southern California to provide an immersive virtual environment for learning language in a more natural way.
Using Wikipedia as a source of canonical tag names. To get the canonical tag name for a subject, get the URL of the wikipedia page for that topic and use the right-most part of it as the tag name. e.g. Ten-pin_bowling from http://en.wikipedia.org/wiki/Ten-pin_bowling
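The extraction step described above is simple enough to sketch directly. A minimal version, assuming the input is a well-formed Wikipedia article URL (the function name `canonical_tag` is my own):

```python
from urllib.parse import urlparse

def canonical_tag(wikipedia_url):
    """Derive a canonical tag name from a Wikipedia article URL.

    The tag is the right-most path segment, e.g. 'Ten-pin_bowling'
    from 'http://en.wikipedia.org/wiki/Ten-pin_bowling'.
    """
    path = urlparse(wikipedia_url).path
    return path.rstrip("/").rsplit("/", 1)[-1]

print(canonical_tag("http://en.wikipedia.org/wiki/Ten-pin_bowling"))
# Ten-pin_bowling
```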
User-interface
Minor:
A user interface principle: operations should be applicable to a type of information, regardless of the context it appears in. If an operation can be applied to an X, it should (unless there's a good reason not to) be able to be applied to Xs anywhere they appear.
Visually referencing web-page elements: we could provide better support for users to provide visual references to elements of webpages or applications, to help in writing instructions for using them. To enable showing rather than telling.
Some minor ideas for operating-system user interfaces. For switching between windows: hit a shortcut key then gesture mouse in direction of desired window, or have the shortcut key bring up a small schematic map of desktop at mouse-pointer location. For file-explorers, have an optional 'Recent Folders' navigation pane. In the top-half of the left-hand pane, it lists the folders you've recently navigated or moved/copied files to.
Dealing with files in a higher-level fashion: not needing to specify a name or location for the file, and not needing to explicitly save them. The posts go into some specifics of this: Sep 2007, Jun 2008, Mar 2009. This is an area where the software landscape has changed since writing these posts, with the way things like Google Docs and mobile apps operate.
Different communication mediums have different affordances. One dimension these affordances range along is from encouraging more lightweight and transient content (twitter, chat), to encouraging content that's more substantial and longer-term (e.g. email). Would it be useful for this to be explicitly acknowledged by the software tools, such that a conversation could be 'promoted' from a lighter-weight medium to a heavier-weight one when that made sense, whereby this transition was seamlessly handled and tracked by the software?
A couple of thoughts on what the smart-phone analogue of Vim-like modal editing would be.
Fan menus: an idea for combining the best parts of radial/pie-menus and conventional context-menus. The menu is rendered like a conventional context-menu, with an additional radial element for more quickly selecting between the menu items. (After having prototyped this, I don't think it quite works, but I do have an idea for another variation that might work better. I intend to write it up sometime).
A few basic ideas for weather forecast sites/apps.
Copy-editing marks and annotations. The Signal vs Noise blog argues that copy-editing marks are better than how 'track changes' edits are rendered, and I see this as an example of our current software's generally poor handling of annotation.
A couple of user-interface ideas. 'Highlighting equivalent links' in the browser: if the user mouses-over a link, and the page contains any other links pointing to the same destination, highlight those as well. Hold-and-swipe touch-screen gesture - an advanced gesture that could be utilised on touch-screen devices: the user holds their thumb on the screen while swiping with another finger.
Idea: personal subject-lines for emails. Instead of seeing software as presenting an objective view of information, see it as a personal tool for the user, suited to how they see things and how they want to use it for the tasks they're trying to achieve. For example, instead of an email inbox simply showing the subject-lines of the received emails, why can't the user change what it says for a subject-line, if that made it more informative for them in the context of how they're going to use that email, or where it'd make it easier for them to find it in the future?
Magic Ink: Information Software and the Graphical Interface, by Bret Victor is a powerful argument for why, in designing software, greater emphasis should be given to graphic design and less to interaction. All his writings are quite good, and worth a read.
Visualisations
These aren't necessarily computing-related, but it made sense to me to put these next to the user-interface section.
Minor:
Cecilia Burman provides an effective way of showing what prosopagnosia (face blindness) is like
"If individuals were rocks, then it's like having to remember the characteristics of each rock, and try to realise when you come across this rock again, from your memories. The page uses photos of different rocks so you can see for yourself the sorts of difficulties involved."
An antipodes map someone has made: select a location and see where the exact opposite side of the world is located.
Lokesh Dhakar's visual catalogue of types of baseball pitches. Each picture is a simple graphic that intuitively captures the essential character of a type of pitch.
Video of Hans Rosling showing some impressive visualisations of statistics using the Trendalyzer tool. Graphs are used to show the statistics, but interactivity and animation are used to show changes over time, to transition between related perspectives, and to drill down into more detail. There's an impressive flow: one perspective raises certain questions, so he modifies the view to try and get insight into them.
Imagine being able to visualise the exact location of a piece of meat (e.g. a steak) within a 3D rendering of the animal it came from. Or within an animated rendering of the animal, as it's grazing around in a paddock. As a way of making visible a reality we know is true but can't see.
Taking before-and-after shots of things like house construction from the exact same positions and angles, would enable better comparison.
Links to some visual illusions. A face that looks calm when viewed up close, but angry when seen from a distance. Invisible changes: images that are continuously changing but which appear static. A large collection of various optical illusions.
Extending cut-and-paste
Being able to view and interact with the clipboard contents in a window, and being able to edit these contents (and the motivation for this).
Storing clipboard history. Being able to select multiple items in clipboard contents and history. Being able to 'peek' at the next paste item.
How more complicated cut-and-paste operations could be specified, by using CTRL+SHIFT to add modifier keys. How such operations could be specified for text copied from context-menus such as "copy link location" in a web-browser.
Controlling where in the clipboard list to cut/paste to/from, and whether a cut/copy replaces an existing item or inserts a new item, and whether a paste leaves the item in the clipboard or not. Cut/copy the item at the end, or at a specific position (either replacing the item already there, or inserted as a new item there). Paste from anywhere in the list, and 'pop-pastes' that remove the pasted item from the list.
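The clipboard ideas above can be sketched as a toy model. This is purely illustrative: the class and method names (`Clipboard`, `copy`, `paste`, `pop`) are my own, not any real OS clipboard API.

```python
# A toy model of a clipboard with history, positional copy/paste, and
# 'pop-pastes' that remove the pasted item from the list.
class Clipboard:
    def __init__(self):
        self.items = []

    def copy(self, item, at=None, replace=False):
        if at is None:
            self.items.append(item)      # copy to the end of the list
        elif replace:
            self.items[at] = item        # replace the item already there
        else:
            self.items.insert(at, item)  # insert as a new item at that position

    def paste(self, at=-1, pop=False):
        item = self.items[at]
        if pop:                          # 'pop-paste': remove item once pasted
            self.items.pop(at)
        return item

cb = Clipboard()
cb.copy("first")
cb.copy("second")
print(cb.paste(pop=True))  # second
print(cb.items)            # ['first']
```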
Template paste: paste a 'template' containing the contents of multiple clipboard items.
Formatted paste: when pasting from rich-text documents into plain-text ones, be able to automatically convert rich-text formatting (bold, italics, etc) into plain-text equivalents (e.g. *bold*, /italics/).
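A minimal sketch of the formatted-paste conversion, assuming rich-text content is modelled as (text, style) runs; this representation and the `to_plain_text` function are hypothetical, not any real clipboard format.

```python
# Map rich-text styles to plain-text marker characters.
MARKERS = {"bold": "*", "italic": "/", "underline": "_"}

def to_plain_text(runs):
    """Convert a sequence of (text, style) runs to marked-up plain text."""
    out = []
    for text, style in runs:
        mark = MARKERS.get(style, "")  # unstyled runs get no marker
        out.append(f"{mark}{text}{mark}")
    return "".join(out)

print(to_plain_text([("really ", None), ("important", "bold"), (" note", None)]))
# really *important* note
```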
Thinking about software
Software is a tool, and we can't predict exactly how users will use tools (a quote relating to this, and an example). We should acknowledge this in the design of software. I believe that, ideally, a piece of software would be a program written in its own very high-level domain language. For example, the WinAmp program you get would be a program in the WinAmp domain language. The grammar of this language would define all the ways we can compose the components of that domain language into valid WinAmp programs. That is, we can customise the program by changing its high-level source code.
Some Alan Kay quotes on computing as the first 'metamedium' that can 'dynamically simulate the details of any other medium'.
Programming
Minor:
Even if you're a solo developer, programming is always a team activity. The other members of the programming team are the future versions of yourself 6, 12, 24.. months from now. A reason why you should always write code for others to read.
Defining functions within functions: if function A is only going to be used within function B, the code could be made clearer if the language allowed function A to be defined within B (at the start of it).
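Python is one language that supports this directly. A small illustration (the function names here are made up for the example):

```python
# A helper only used by one function can be defined inside it,
# keeping its scope (and the reader's attention) local.
def format_report(entries):
    # format_entry is only used by format_report, so it lives within it.
    def format_entry(entry):
        name, value = entry
        return f"{name}: {value}"

    return "\n".join(format_entry(e) for e in entries)

print(format_report([("cpu", "85%"), ("mem", "1.2GB")]))
```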
Sequences of lines of code may form a logical unit, but be too lightweight to break out into a separate function. We may want to document the overall function of such sequences, or even just indicate that they form a logical unit. Comments may be used, but there's no good way to specify the span of lines a comment applies to, so it could be useful to have a lightweight way of indicating that a sequence of lines forms a logical unit, and the post discusses how this could be done and how it'd enable those units to be 'collapsed' to provide a more concise view of a function.
When an experienced programmer is trying to learn a new language, what's the most efficient way to present them with what they need to know? I suspect it's to provide concrete examples of the features, which show the syntax and the results you get for particular inputs. Essentially, a collection of simple, representative examples, showing the exact inputs and outputs.
Teaching Programming
An idea: using a game to teach regular expressions. For example, use regular expressions to attack oncoming patterns of enemies.
Against 'simple' as meaning 'fewer elements'
To explain this viewpoint, I outline the notion of 'qualitative perceptual concepts', and argue that complexity is one of these concepts.
Local vs global simplicity. What can make some software seem simpler, from the perspective of that software by itself (at a 'local' level), can actually make it more complex to use in practice, in the context of the user's use of it and other software it's used in conjunction with (at a 'global' level).
Initial vs standing simplicity. How simple the software is initially, vs how simple it is for an experienced user.
Looking at the value of a potential new feature in isolation, rather than its potential benefit as part of the entire system, is a reason that software can become too complex.
Others have argued against 'minimal features' view of simplicity, and for simplicity as primarily meaning the software suits the user's perspective and tasks: Joel Spolsky (who also brings in the notion of 'elegance' meaning 'grace and economy at achieving some task'), and Dharmesh Shah (1) and (2, which also argues that software should be 'opinionated' but not 'stubborn').
Avdi Grimm teases out the various different sorts of simplicity, and the tradeoffs involved in each type.
More choices may mean more complexity, but there is some nuance to this. How many choices you need to deal with at once is the bigger issue.
Education
Minor:
A goal for education: giving people confidence in their learning ability.
On Producing High-Quality Work
AVC: How do you establish that kind of creative control?
RG: I just demand it. I just simply wouldn't do anything that I wasn't terribly in charge of. I don't let anything go.
Philosophy
Just because something has always been a philosophical question doesn't mean it will always remain so. We may obtain a concrete understanding of the matter, whereby it will become science.
There are learned philosophers but no philosophical experts. Philosophy concerns what we don't have a clear understanding of. It involves discussions and arguments and all the back and forth between these. Progress is in clearing up confusions, and making finer points. This means there's no established knowledge to have expertise in. Instead, being an 'expert' means being more learned in the ins and outs of the web of philosophical discourse. This has significance for the potential for outsiders, not part of the philosophy discipline, to make contributions to philosophical topics.
On writing, presenting, researching, getting things done
Paul Graham's essay "Persuade xor Discover": writing to persuade and writing to discover are diametrically opposed, which is important to realise if you wish to discover (direct link).
Some Signal vs Noise posts on communication: on the usefulness of counterintuitive-seeming statements, for getting people's attention; on how "the Curse of Knowledge" makes communication difficult; on describing a slice instead of the whole pie; and on what being a speechwriter is like, and how it's similar to doing graphic design.
A list of authors I think are good at non-fiction communication (written in 2005).
Richard Hamming's advice on doing research - "You and Your Research" (direct link). I tend to find most advice on doing research a bit ordinary, but I thought this was quite good.
A Signal vs Noise post 'Do it yourself first', on why it's really beneficial for you to have tried to do a kind of work in a company before hiring someone to do it (direct link).
Misc ideas
Imagine a large coffee-table book, with one page for each country in the world. For each country's page, imagine a single large photograph. Think of something like a high-school class photo, but with a larger number of people in it -- something like 200 (instead of four or five rows of people, maybe there'd be seven or eight). Where those 200 people have been chosen to represent the ethnic diversity of the people in the country, and something of the variation within each of the main groups (perhaps 200 people wouldn't be sufficient, maybe 300 or some other number would work better; and yes, whatever set of individuals was chosen would involve some degree of controversy, but there is positive intent behind this idea and some degree of controversy shouldn't be a roadblock). Obviously, any such thing would be a major approximation of the actual range of people in the country, but the idea is that even a book containing such major approximations would give most people a greater sense of the diversity out there than they currently have. Personally, I think such a thing could be quite interesting. It's obviously something that could be done as a web-site, too.
Allowing wiki-style edits on every single web-page. A potentially practical way for this to work, and why it could be useful.
An idea for a multi-touch rhythm game, where the player needs to perform actions like tapping at three positions on the screen at once, or tapping at two locations and then sliding those two fingers along a path.
Reframing how digital content is sold, from you paying for the content itself to paying for the effort that went into creating it.
Minor:
As an alternative scheme for presenting footnotes, write the main body of the text as if it had no footnotes, and at the end of each chapter have a 'footnotes' section where each footnote begins with a recap of the details from the main body of the text that it relates to.
It'd be interesting to have a documentary that covers the differences between fictionalised accounts of police or medical work and the actual character of the work in those fields.
Smartphone app idea, "Battlechat": a competitive real-time text chat game.
Remote-controlled cars or drones could be used like a Logo turtle for teaching programming concepts, with more of an emphasis on navigating the physical environment.
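To give a sense of what those Logo-style commands might look like, here's a minimal sketch in Python. The `DriveTurtle` class and its method names are hypothetical, not a real drone API; the actual vehicle-control calls are replaced with dead-reckoning maths that just tracks where the vehicle would end up.

```python
import math

class DriveTurtle:
    """Tracks position and heading as Logo-style commands are issued.

    On real hardware, forward() and right() would send motor commands
    to the car or drone instead of (or as well as) updating state.
    """
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees; 0 means facing along +y

    def forward(self, distance):
        # Dead-reckoning stand-in for "drive forward this far".
        rad = math.radians(self.heading)
        self.x += distance * math.sin(rad)
        self.y += distance * math.cos(rad)

    def right(self, degrees):
        # Stand-in for "turn clockwise in place".
        self.heading = (self.heading + degrees) % 360

# The classic first Logo exercise: drive a square.
bot = DriveTurtle()
for _ in range(4):
    bot.forward(100)
    bot.right(90)
# The turtle ends up back (numerically) at its starting point, heading 0.
print(bot.x, bot.y, bot.heading)
```

The physical-environment emphasis would come in once the same commands drive a real vehicle: the student has to account for obstacles, drift, and imprecise turns, which the idealised Logo turtle never had.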
Misc points
Chemicals can be completely natural. On the mistaken notion of 'chemicals' amongst the general public.
Strong leadership of the JFK "moonshot" sort might be what's needed to tackle climate change. One incentive for such leadership is the opportunity to be seen as a hero by people in other countries and by future generations.
An analogy to show that, even though qualia seem so inherently mysterious, they need not actually be so: imagine a person from a remote tribe who knows nothing about modern technology and is given a handheld games console. Nothing in their knowledge or experience could suggest what the game world is or how it might work.
A short quip: "Unseen practice is easily mistaken for brilliance".
Misc
Jaron Lanier writes about the anti-intellectualism in today's society, the need to overcome it, and suggestions on doing so. One of the topics he focuses on is spirituality and how that can mesh with a naturalistic worldview.
Daniel Dennett's policy recommendation that every child be taught about all world religions (direct link):
Toxic religions depend on enforced ignorance of the young -- and a religion that can flourish in an informed citizenry is a benign religion.
I agree with the sentiment in Chris Mooney's article "The Monster That Wouldn't Die" (direct link)
I'm tired of preachy retreads of the Frankenstein myth, first laid out in Mary Shelley's 19th-century classic and recycled by Hollywood constantly in films from Godsend to Jurassic Park. I'm sick of gross caricatures of mad-scientist megalomaniacs out to accrue for themselves powers reserved only for God. I'm fed up with the insinuation (for it's never an argument, always an insinuation) that there's a taboo against the pursuit of certain kinds of knowledge and that certain technological achievements -- especially those with the potential to affect life itself -- are inherently "unnatural."
...
I'm extremely uncomfortable with the way in which the weapon of the Frankenstein myth is repeatedly used as a club against modern-day medical researchers, who are seeking to cure people, not to become God.
Kevin Kelly looks at how, when copies are free, you have to sell what can't be copied (direct link). He goes into the sorts of non-copyable things that can be sold. I outline some thoughts about this in terms of the view of technological development as the removal of constraints, and why that view is useful.
Related to my PhD
On distinctions made between 'data', 'information' and 'knowledge'. What we really need is a better understanding of the phenomena. It isn't very helpful to try to make sharp distinctions when our understanding is lacking.
A grab bag of things with some relation to my PhD work:
Paul Graham on discarding the assumption that it's all about us (direct link). I believe this is key to understanding the foundations of information and semantics.
if you want to discover things that have been overlooked till now, one really good place to look is in our blind spot: in our natural, naive belief that it's all about us.
The computer revolution as a switch to imperative descriptions. In "The Structure and Interpretation of Computer Programs", Abelson and Sussman write:
The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology -- the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects.
(Wolfram argues similar points)
How explicit does a representation have to be? Is there really a hard distinction between explicitly and implicitly representing some details? I suspect not.