Hacker News | _zh9y's comments

I wonder who designed this website. It looks great from a visual standpoint.


What do you like about hey.com?


Thanks! This comment made me happy :-)

I do think about how photography’s claim to authority, its claim to represent some objective reality we all live in, is being called into question by the prevalence of filters and Photoshop on social media.

It is not that social media is a digital reflection of the physical world; it is more an alternate reality in which we all have digital avatars that may or may not correspond to a physical self.


I like the design of your website!

What do you mean when you say words are disentangled, standalone concepts? I see words as being very much related to each other.

I assume I may be misinterpreting what you mean by "disentangled, standalone concepts".

Barbara Tversky's research seems to contradict linguistic relativism. I definitely don’t think language is the foundation of cognition.


Thanks!

A word is considered a "discrete unit of meaning", i.e. 3/4 of a word doesn't really mean much. So words like "red" and "grass" are "standalone" in the sense that they mean something by themselves. I agree that words are very much related to each other, in the sense that you can combine them.

I was trying to draw the connection that the "disentangled representations" ML folks often talk about are but a special few-word case of grammars for combining distinct concepts.
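As a toy illustration of that connection (my own sketch, not from the thread; the vocabulary and functions are invented for the example), one can treat each word as owning an independent axis of a vector, with a "grammar" that composes them:

```python
# Hypothetical mini-vocabulary; each standalone word-concept gets its own
# dimension, so concepts stay "disentangled".
VOCAB = ["red", "green", "grass", "sky"]

def one_hot(word):
    """Map a word to a vector with a 1 on its own axis and 0 elsewhere."""
    return [1 if w == word else 0 for w in VOCAB]

def phrase_vector(words):
    """Combine standalone word-concepts by elementwise addition;
    a crude stand-in for a grammar that composes distinct concepts."""
    return [sum(dims) for dims in zip(*(one_hot(w) for w in words))]

vec = phrase_vector(["red", "grass"])  # -> [1, 0, 1, 0]
```

Here "red grass" is representable even if the pair never co-occurred, which is the factorized-combination property that disentangled representations aim for.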


Unfortunately, words aren't that simple, though it's close. Prefixes, suffixes, infixes, endings, etc., all carry discrete meaning as well. And in Asian languages, this is much more obvious.

The level at which meaning is discretized generally sits somewhere between a syllable and a word, with a few exceptions for shorter modifiers.

Unfortunately, in linguistics, the concept of a "word" is only as well defined as "planet" was before Pluto lost its status.

Similarly, when you look at riddles and crossword-puzzle clues, the idea of words being discrete also falls apart. Words, very much like variables in algebra, only have meaning in relation to the other pieces of the context they are attached to.

While the mechanics you talk about don't seem to hold (all the pieces of language, syntax, and semantics are not discretizable; just talk to anyone working on a dictionary), I do think the underlying idea does hold.


Fair enough. I agree that if we really examine the claim "a word is a discrete unit of meaning", the edge cases start to accumulate and the semantics rapidly break down. But barring things like prefixes/suffixes/modifiers/composite characters in traditional Chinese, words are fairly discrete and generally regarded as the primary layer for expressing singular units of "meaning".


They are, but only because we don't have better language to express them. Similar to a lot of the problems with Chomsky's work, the composability of language is only a subset of the whole breadth of what is expressible in a given language.

Or in other words, I believe the "edge cases" have a surface area similar to that of the rest of the language. The difference is that they aren't invoked nearly as often, because they require more effort and creativity.

Just look at the rise of words like "hangry". There are types of mashups that show up in creative uses of language that defy nearly any rule for any language you can come up with. In many languages, if you choose any of those supposed rules you can probably construct an algorithm to generate odd, but understandable words that defy that rule.


> Or in other words, I believe the "edge cases" have a surface area similar to that of the rest of the language. The difference is that they aren't invoked nearly as often, because they require more effort and creativity.

Edge cases or exceptions do tend towards being highly used; this is because language is more likely to change the more it's used, so the most highly used words/phrases/sentences/etc tend to accumulate changes. One example of this is that if a language has verb conjugation and irregular verbs, then odds are some of its most common verbs will be irregular.

> Just look at the rise of words like "hangry". There are types of mashups that show up in creative uses of language that defy nearly any rule for any language you can come up with. In many languages, if you choose any of those supposed rules you can probably construct an algorithm to generate odd, but understandable words that defy that rule.

Weirdly enough, there are rules for that which would work. There are just a ton of them.


My only point here is that any framework for generalization needs to be able to account for and incorporate these kinds of "exception-seeking" cases, much as mathematics uses counterexamples to strengthen and reinforce the definitions it chooses.


I agree with your comment "In many languages, if you choose any of those supposed rules you can probably construct an algorithm to generate odd, but understandable words that defy that rule." It comes in many forms, from Goodhart's Law to the "hot dog vs. sandwich" debate.

I do mention this in my blog post: although I think Generalization is Language, I don't think it's possible to create a formal framework of language, precisely because of the "adversarial examples" that can be supplied for any formal definition.

Natural language itself, ignorant of formality, is able to account for these exceptions insofar as language is sufficient for people to convey a bare minimum of meaning. I am proposing to define language and generalization via the implicit understanding of large language models, in the same way you might use an image classifier to define "cat images" or "hot dogs".


Hmm, I can understand the motivation. However, I feel it either won't work or will be very fragile, because this is already part of the model: they're trained on natural language.

DL is already far from formal models; that's why deep learning "works." And even at the current level of DL models, those exceptions are represented to some extent.

So ultimately, your idea is to push the models toward further generality, which, in my opinion, will bake these "exceptions" deeper into the model.

And my question is: what does that mean for your idea? In my mind, trying to exclude them would break what works. On the other hand, ignoring them means you can't direct development towards your goal, because there's no map from language to generalizations, so you would be relying on random chance for progress.

If this is off in left field, let me know, but that's what I can see from your description.


The problem with "word", as with many terms in linguistics, is that it's a prescientific unit of analysis.

I certainly think most linguistic typologists would say that there is no cross-linguistic unit that corresponds to our intuitive understanding of word, which is really grounded mostly in orthography.

I think it's fairly easy to show that orthography should not have much say in this matter, though. Of course you can't get around it in language didactics, but in scientific description we need to be very careful with it. Bob Dixon and Alexandra Aikhenvald give some examples from Bantu languages in their Word: A Cross-Linguistic Typology. In Sotho, the sentence "We will skin it with his knife" is written "Re tlo e bua ka thipa ya gagwe", while in the orthographies for Zulu and Xhosa, the same sentence would be rendered as "Retloebua kathipa yagagwe". You really need to look at each language to find a sensible set of analytical categories, and be very explicit about your criteria, be they syntactic, semantic, or phonological.


Linguistics has a distinction for what you're talking about: morpheme versus word. Morphology is the study of this area. I freaking loved my morphology classes.


While I think there's a generally accepted definition of the morpheme (as the smallest meaningful unit), that doesn't give you a good definition of the word. (Because there isn't one.)

Funny you use the term morphology like that. To me it's basically synonymous with inflection, very traditional, whereas morpheme is very much a structuralist term. But all my teachers were cognitive-functional linguists, so everything was cut rather differently, and sometimes it's hard to talk.


Yeah, my morphology teacher was a structuralist, and this was quite a while ago, so I have no doubt I'm biased there. (I actually preferred the cognitive stuff I was introduced to; I really liked working with metaphor in their systems and syntax/phonology/morphology were less my thing than semantics and sociolinguistics.)

You're definitely right that the definitions aren't cut-and-dried and that makes typology rather difficult.


And there are also multi-word expressions (MWEs), where the meaning of the whole is different from that of the sum of its parts, e.g. “out of the blue”, “bite the bullet”.


Yup. Going the other direction is a thing as well.


Actually, the discrete unit of meaning, linguistically, is the morpheme. It's a small difference, but it matters. Some words are morphemes, but not all, and not all morphemes are words.

Language, man. It's weird.


I can see how this could work in English. I’m not sure if there are other languages in which 3/4 of a word carries more meaning. (I’m a primary English speaker, so this concern could be unfounded.)


In many languages you literally have 3/4 of the word carrying the meaning of the actual word, with the remaining 1/4 of sounds or letters devoted to grammatical markers for gender/case/number/etc.

Using the classic Latin example from Monty Python, "Romani ite domum" vs. "Romanes eunt domus":

the "Roman" part of Romanes/Romani carries most of the meaning, while the -es/-i carries information that's largely orthogonal to it.


All languages have something analogous to words in this way, although it can be hard to know where to draw the boundaries sometimes.

Technically the smallest indivisible unit that bears meaning is the morpheme, not the word. For example the word “cats” in English consists of two morphemes, cat+s. The first morpheme can stand on its own as a word, but the second can’t.
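A minimal sketch of that cat+s segmentation (an invented toy heuristic, not a real morphological analyzer) makes the limits obvious:

```python
# Toy suffix-stripping segmenter; real morphology is far messier.
SUFFIXES = ["s", "ed", "ing"]

def split_morphemes(word):
    """Split off one known suffix, trying longest suffixes first."""
    for suf in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suf) and len(word) > len(suf):
            return [word[: -len(suf)], suf]
    return [word]

print(split_morphemes("cats"))   # ['cat', 's']
print(split_morphemes("grass"))  # ['gras', 's'] ... oops
```

The mis-segmentation of "grass" is exactly the kind of edge case discussed upthread: surface form alone doesn't tell you where the morpheme boundaries are.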


I agree, but I think the trickier part is that the semantics of words are even blurrier/more ambiguous than the syntax.


Yeah, hence the turn away from dictionary definitions and things like WordNet towards continuous distributional vector representations in NLP.

I don’t think you could really give an uncontroversial symbolic definition for any natural word.
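To make the distributional idea concrete, here is a toy sketch (the corpus and functions are invented for illustration; real NLP systems use learned embeddings, not raw counts): each word gets a vector from its co-occurrence contexts, and similarity is continuous rather than definitional.

```python
import math
from collections import Counter

# Invented three-sentence corpus, for illustration only.
CORPUS = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks rose on the news",
]

def context_vector(word, window=1):
    """Count the words appearing within `window` tokens of `word`."""
    counts = Counter()
    for sentence in CORPUS:
        toks = sentence.split()
        for i, tok in enumerate(toks):
            if tok == word:
                lo, hi = max(0, i - window), min(len(toks), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[toks[j]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

cat, dog, stocks = (context_vector(w) for w in ("cat", "dog", "stocks"))
# "cat" and "dog" share contexts ("the", "sat"), so they land closer to
# each other than either does to "stocks".
```

No symbolic definition of "cat" is involved anywhere; the representation is just usage statistics, which is the point of the distributional turn.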


I agree - this article is a bit scary.


Objective truth does not exist, in my opinion, so I see this article as misguided.

Consider reading Manufacturing Consent:

https://en.wikipedia.org/wiki/Manufacturing_Consent


In my opinion, objective truth does not exist. "Fact-free argumentation" is good and desirable.


> Objective truth does not exist.

Besides this statement?


Well, enderm did say "in my opinion"...


> Well, enderm did say "in my opinion"...

Not when I posted, he didn't. That's why my quotation of him is different from what's now in his post.


Oh I was writing the same comment, you beat me to it, and more elegantly!


I think, therefore I am. So the world outside can't be confirmed, only your own existence? I get it. But what if the more objective party (society, a collective of subjectives) disagrees with you and your existence? It doesn't matter what you think; you will not think or be if they deem your "truth" an untruth or a threat. For at least this reason, I think the ego should be somewhat tamed and not undermine the objective, or else the subjective becomes more endangered.


> Objective truth does not exist

That's an objective truth claim right there!


Moreover, if this claim is true, then it's false. It therefore refutes itself.


Do subjective truths exist?


Stuttering is not a disease and does not need a cure.


> Stuttering is not a disease and does not need a cure.

I stutter. I honestly don't understand how you can come to that conclusion. I do not understand how something that makes it more difficult for me to talk is not a disease and should not be cured.

I'm also autistic, and while I can see the arguments that autism opens doors as well as closing them in terms of giving me new ways to think that allistic people don't have, all stuttering does is close doors. I know what I want to say, I just can't get it out without exercising more care than other people.


People have a tendency to conflate adjectives with identity, and don't like it when that identity is threatened. Give it a couple of years and I expect to get messages on social media inviting me to join groups for people with persistent COVID anosmia, and a couple of years after that they'll raise hell if someone comes up with a cure for it. Blindness, deafness, paraplegia, nearly every type of bodily non-function is now who you are instead of just something about you.


A very close-minded perspective IMO. Human beings are complex systems molded by genetic evolution and environmental pressures, as well as culture and technology. Being "deficient" in one area nearly always confers a benefit in some other.

If I were you I'd develop a personality that deals with the particulars of your existence in a positive and constructive way. So you stutter. That means your language and by extension thought is different and unique in a way most people couldn't even comprehend. What can you learn about the universe? What kind of system of thinking and self-expression will you develop that works around your bottlenecks and channels your strengths?

> I do not understand how something that makes it more difficult for me to talk is not a disease and should not be cured.

Consider, for example, games. A game that makes it more difficult for you to win is not a disease and does not need curing. It is a challenge that ultimately leads to self-improvement. Difficult challenges commonly begin with frustration and dejection, neither of which will actually help you move forward. Figure out how to move forward, and you will discover your self along the way.


So... if I have a handicap, I should look at it as an opportunity or challenge? I should just rise up to the challenge? Yeah, well, if I don't have any options, then I'll make do.

I really only see lack of empathy in your response. Stuttering is a handicap, plain and simple. Yes, some of us can live with it. It is debilitating and humiliating, and if you do not suffer from it, I can't imagine how you can begin prescribing advice. You can have the universe; I just want to order some coffee (thank god for self-serve POS).


Calling anything "plain and simple" is an admission of knowing nothing about the subject. Nothing in the universe is plain and simple. Every single mutation, trait and feature of every single human being who has ever been born can be thought of as a handicap. Classically, even being perfect is considered one.

This is only a reflection of one's inability to play the damn game, instead adopting a victim mentality for pity points.

When I went abroad and didn't speak a single word of the local language, I still managed to order coffee just fine by pointing my finger at what I wanted, and smiling. There is nothing debilitating nor humiliating about any of it. Grow up and develop a personality.


> This is only a reflection of one's inability to play the damn game, instead adopting a victim mentality for pity points.

My life isn't a game.

Thanks for trying, but you're obviously unsuited to the task of having this discussion.


Maybe that's why you're suffering instead of enjoying it.


> Being "deficient" in one area nearly always confers a benefit in some other.

Not even close

> Consider, for example, games. A game that makes it more difficult for you to win is not a disease and does not need curing. It is a challenge that ultimately leads to self-improvement. Difficult challenges commonly begin with frustration and dejection, neither of which will actually help you move forward. Figure out how to move forward, and you will discover your self along the way.

A game which increases the intensity of its challenges will make you better at the game. A game which randomly misinterprets your inputs will not.


> A game which randomly misinterprets your inputs will not.

You clearly know nothing about games. This specific kind of challenge (called 'output randomness') is widely employed in games, and it absolutely makes you better at carefully considering what you're doing (because it might go wrong), making contingency plans (inevitably it will go wrong), encoding your intent efficiently (to maximize the outcome despite things going wrong), and prioritizing (dedicating more effort to the important things first).

The results of getting better at this kind of thing can be seen in telecommunications, where the unreliability of links has required very smart people to figure out how to communicate over them anyway. And our networks are more robust because of it.
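The planning effect of output randomness can be sketched in a few lines (an invented simulation, not from any of the games mentioned): when each action succeeds only with probability p, a robust plan has to budget roughly 1/p attempts per action, much like retransmission over a lossy link.

```python
import random

def attempts_needed(p_success, rng):
    """Simulate how many tries one action takes when its outcome is random."""
    tries = 1
    while rng.random() > p_success:
        tries += 1
    return tries

rng = random.Random(0)  # fixed seed for reproducibility
trials = [attempts_needed(0.5, rng) for _ in range(10_000)]
avg = sum(trials) / len(trials)
# With p = 0.5 the expected number of tries is 1/p = 2, so a plan that
# budgets one try per action will routinely fall short.
```

The contingency-planning skill the parent describes is exactly this: sizing your plan around the expected number of failures, not around the optimistic case.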


Please enlighten me on the games that intentionally misinterpret player inputs. Accusing me of knowing nothing about games is a pretty reckless claim. Think about how fucking stupid that kind of thing is to claim of a random stranger. Games aren’t even a niche subject area.

“Output randomness”, as in, rolling a d20 to see how your attack goes is not remotely close conceptually to the idea of a game that intentionally misinterprets your controls.

If you choose to attack, your attack might not succeed. But that’s not a good analogy for a disability; those are the rules everybody plays by. A disability like a stutter would be: you decide you want to attack, but then the game decides you’re going to use a useless item instead.


Gladly.

Traditional roguelikes sometimes implement a spell or effect of 'confusion' that causes the symbols by which you read the game world to shuffle around randomly, thus obscuring the information of what is what. Then an item you thought was an X turns out to be a Y when you attempt to use it. Many of them have items or spells with inherently uncertain outcomes, like wands of random effects.

The modern traditional roguelike Cogmind has a mechanism where a certain type of damage causes corruption to accumulate, causing your character to increasingly perform random actions instead of the actions you input, such as randomly triggering weapons to fire at random targets. Skilled players take this into account to avoid catastrophic accidents, for example by dropping all their weapons before talking to a friendly NPC.

The tactical/strategy game XCOM models the psychological stress of combat operations as a chance that a soldier will inadvertently reject your order and do something else instead, in panic. Darkest Dungeon leans heavily on a similar mechanic, where accumulating stress or negative personality traits can cause your party members to act against your command, of their own volition, messing up your plans and necessitating the organization of actions around the least reliable elements.

Deck building card games in general present an inconsistent and unreliable set of options at any given moment, as your available actions are a randomly drawn set of cards. Despite your best efforts to include tools for dealing with situations, you can never rely on a particular tool being available when it's needed, leading to heavy planning around probabilities, redundancy, and flexible card combinations.

Since your understanding of the subject is "rolling a d20 to see how your attack goes", I would say my accusation is perfectly well placed. No need to get defensive about it. I've been researching games for decades. The problem is when you make factual statements like "a game which randomly misinterprets your inputs will not make you better at playing it" (paraphrased) as if you were an expert on the subject, when anyone who has studied games to any length can immediately see that your claim is false.

Let me know if you'd like more examples, these are the ones off the top of my head.


None of those are misinterpreted inputs. Those are clearly defined mechanics within the games that have a chance of an unexpected outcome, but only in explicitly designed contexts.

I don’t see any of them making you better at the game either, but rather just being a context that you need to adapt to. Dropping all weapons before talking to someone because there’s a chance you might attack them if you don’t isn’t making you better at the game, it’s adding tedium to circumvent a risk that is only present due to your confusion state. If you had no such confusion, this behavior would just be a waste of time. You’re not better at the game generically for doing this.

Also fwiw that sounds like an incredibly stupid game mechanic in cogmind. None of your other examples felt remotely relevant to me.


> None of those are misinterpreted inputs. Those are clearly defined mechanics within the games that have a chance to have an unexpected outcome but in exclusively designed contexts.

Video games are computer programs. As a programmer, you have to clearly define the rules to the computer, otherwise the computer will refuse to run your code. Programming unexpected outcomes is by necessity defining clearly probabilities and consequences of failure. How else would you even do it? Actually corrupting the program's memory? Yeah good luck with that.

All I hear is "I don't understand this so I think it's stupid so you must be stupid and these games must be stupid as well."


If disabilities were that good, people would be lining up to give themselves one.

I am 10,000% "ableist" despite having a recognized disability, because having a disability effing sucks.


Funny you should say that, because it's somewhat common for people to engage in self-ascribed challenges for self-improvement, including not speaking.

https://en.wikipedia.org/wiki/Vow_of_silence


Try going on stage in front of your whole school and being stuck on repeat, saying a single syllable for over 5 minutes, while your friends try to do something in the background so they aren't just standing and waiting.

Stuttering might not be a disease but it definitely needs a cure.


Most of my professors in college were awful. I didn’t understand why.

I just read a great book - Teaching Community, by bell hooks - which helped me understand why my college education was of such poor quality.


Those what?

