This seems to be a problem with the Stata code. Disappointing but not really surprising. Stata isn't an easy language to debug and problems like this happen a lot. I wonder how much coding was done by the original author versus being coded by a research assistant.
I think that some of the biggest omitted-variable issues could have been addressed by using the "new" Stata factor-variable interaction syntax ("new" in the sense that it has been available since about ten releases ago) instead of rolling their own product terms and forgetting to include the main effects.
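For what it's worth, the difference is one operator (variable names here are hypothetical):

```stata
* Hand-rolled product term: easy to forget the main effects
gen x1x2 = x1 * x2
regress y x1x2          // omits x1 and x2 -> omitted-variable bias

* Factor-variable syntax: ## expands to main effects plus interaction
regress y c.x1##c.x2    // fits y on x1, x2, and x1*x2
```

The `c.` prefix marks the variables as continuous; `##` guarantees the main effects are in the model, so this class of mistake can't happen.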
Regarding public comments, I don't believe a good politician will make a snap decision at the dais following public comments. Most of them will have received the meeting agenda in advance and formed an opinion about how they are going to vote and the questions they are going to ask. If this is the case, public comment is just a waste of time for them, as they won't really get swayed by it. At most, they will mention a point that a public commenter made to support something that they were already going to do.
Emailing them privately in advance of the meeting will give them the opportunity to think about your input and, in some cases, reply and engage with you about the policy. It might not change their mind, but it will definitely help them see others' perspectives on their upcoming decision.
In the best case, you're betting that some subset of whichever board shows up hasn't already counted the votes they need for their preferred outcome. If something's on the agenda, it's because the board wanted it on the agenda!
The ironic part about these hallucinations is that a research paper includes a literature review because the goal of the research is to be in dialogue with prior work, to show a gap in the existing literature, and to further the knowledge that this prior work has built.
By using an LLM to fabricate citations, authors are moving away from this noble pursuit of knowledge built on the "shoulders of giants" and showing that, behind the curtain, output volume is what really matters in modern US research communities.
That's going to be the philosophical question of our times: do LLMs generate slop out of nowhere, or do they simply amplify the slop machinery that was already there?
Neither Python nor R does a good job at all of these.
The original article seems to focus on challenges in using Python for data preparation/processing, mostly pointing out challenges with Pandas and "raw" Python code for data processing.
This could be solved by switching to something like duckdb and SQL to process data.
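For example, a typical cleaning-and-aggregation step that takes a page of Pandas collapses to one query in DuckDB (file and column names here are hypothetical):

```sql
-- Read a raw CSV, filter, and aggregate in a single pass
SELECT region,
       avg(price) AS avg_price,
       count(*)   AS n_sales
FROM read_csv_auto('sales.csv')
WHERE price IS NOT NULL
GROUP BY region
ORDER BY avg_price DESC;
```

The same query runs unchanged from the DuckDB CLI, Python, or R, which sidesteps a lot of the "raw Python" glue code the article complains about.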
As far as data analysis, both Python and R have their own niches, depending on field. Similarly, there are other specialized languages (e.g., SAS, Matlab) that are still used for domain-specific applications.
I personally find result preparation somewhat difficult in both Python and R. Stargazer is OK for exporting regression tables, but it's not really that great. Graphing is probably better in R, within the ggplot universe (I'm aware of the Python port).
I set up something similar at work. But it was before the DuckLake format was available, so it just uses manually generated Parquet files saved to a bucket and a light DuckDB catalog that uses views to expose the parquet files. This lets us update the Parquet files using our ETL process and just refresh the catalog when there is a schema change.
We didn't find the frozen DuckLake setup useful for our use case, mostly because a frozen catalog kind of doesn't make sense with the DuckLake philosophy, and the cost-benefit wasn't there over a regular DuckDB catalog. It also made updates cumbersome: you need to pull the DuckLake catalog, commit the changes, and re-upload the catalog (instead of just directly updating the Parquet files). I get that we're missing the time-travel part of DuckLake, but that's not critical for us, and if it becomes important, we would just roll out a PostgreSQL database to manage the catalog.
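The views-over-Parquet catalog described above amounts to a few statements of this shape (bucket, path, and table names are hypothetical; reading from S3 also assumes DuckDB's httpfs extension is loaded):

```sql
-- Rebuild the lightweight catalog: one view per dataset,
-- each pointing at Parquet files the ETL process writes directly.
INSTALL httpfs; LOAD httpfs;

CREATE OR REPLACE VIEW sales AS
SELECT * FROM read_parquet('s3://my-bucket/sales/*.parquet');

CREATE OR REPLACE VIEW customers AS
SELECT * FROM read_parquet('s3://my-bucket/customers/*.parquet');
```

Because the views are just pointers, updating a dataset is a plain Parquet overwrite; the catalog file only needs to be regenerated on a schema change.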
Because applying for a visa takes money, time, and a visit to the embassy.
ESTA/ETIAS gets automatically approved within a few minutes of paying the fee (I'd guess this is true for 99.999% of applicants).
Very few countries allow people to just show up and cross the border. US citizens had that privilege in a lot of places, but it looks like it’s changing now.
I have never visited an embassy to get a visa--though I did cancel a couple of business trips when it became too much of an effort because of timing relative to other trips. I've travelled to a bunch of countries where I could just go through immigration with a US passport or maybe pay for a visa on arrival.
The hunter-gatherers in the study lived in the "Late Holocene (~4000 to 250 BP)", meaning between 2050 BCE and 1700 CE. These people are separated from us by less than 150 generations. I don't believe that humans evolve that fast, so the way you think, feel, ache, and so on also applies to them. Would you leave behind your injured and disabled in their situation (which is speculated to be the result of hunting accidents)?
Anthropology started at a time when people thought civilizations evolved in a straight line from savages to England. But it's hard to pretend that the natives sat around a rock grunting at each other when their e.g. bone-setting techniques were essentially modern, so there's a tradition of "not as benighted as you might have thought" articles.
WHY that point of view still exists is a question every anthro novice asks, and it turns out that cultural evolution is too attractive an idea for some people to let go of.
Seems crazy to me. Anyone with children exposed to multiple languages can easily imagine how complex the language scene must have been among humans who did not write, given how easy and natural it is for little ones to pick up the different languages they speak with different people.
Most likely even Heidelbergensis had "complex grunting" and hand signs, so humans in the Neolithic are effectively identical to us in language capability.
Go ahead. Invent some new tech that absolutely no one knows about or knows how to make, and that isn't based on any known tech. I'm waiting. What's taking so long?
Discovering stuff is hard, and harder if you don't think you need it. People kept fire going before they knew how to start fires. If you don't know about the concept of flint or lighting dry tinder with sparks, it is really hard to invent fire starting. Writing isn't as useful if you can just learn what you need to know while growing up. A more complicated world later on - as discoveries slowly started to build up - probably created the need.
But again, those discoveries are hard and they took time. A really long time, apparently.
I think there is a tendency to project the modern era's speed of technological progress back in time, which isn't reasonable. We went from the Wright Brothers to Apollo 11 in 66 years. The first transistor to the iPhone in ~60 years. That rate of development is...new.
Why is it written BP? These archaeology people / Phys.org really need to cease with that confusing nonsense. BP is supposedly "Before Present" (or, jokingly, "Before Physics"), referring to practical radiocarbon dating with a cutoff date of January 1, 1950. [1] It's way too easy to transpose BCE / BC / BP.
It's written as if these people were supposedly cave people, yet based on this story's confusing usage, these people were caring for each other after the Spanish and Portuguese colonization of South America, up to the 1700s. 4000 BP is the "really Late Holocene," 2050 BCE; 250 BP is 1700 AD. Also, the "late Holocene" goes all the way to Y2K (2000 AD). [2] The Meghalayan is "the current age or latest geologic age." [3]
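The conversion the convention hides is just a fixed offset from 1950; a minimal sketch:

```python
# BP ("Before Present") counts years backward from January 1, 1950,
# the reference point for radiocarbon dating.
def bp_to_year(bp):
    """Convert a BP date to a calendar year (negative = BCE, positive = CE)."""
    return 1950 - bp

print(bp_to_year(4000))  # -2050, i.e. 2050 BCE
print(bp_to_year(250))   # 1700, i.e. 1700 CE
```

So "~4000 to 250 BP" spans roughly 2050 BCE to 1700 CE, which is exactly the transposition trap the parent comment is complaining about.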
Evidence of animals doing this exists. Unsure why anyone would be surprised there's evidence of humans doing this.
It's really wild to me how many humans believe their feelings are so different from animals. Most animals have similar incentives and desires, humans just have "better" tools to achieve them.
Grieving orcas have been found to move their dead babies around for many weeks. Chimps will fetch plants like Scutia myrtina, which is toxic in large amounts but acts as an anthelmintic (anti-worm drug), for fellow members of their group when they're sick. Elephants will defend their wounded, bring them food or water, and even help them stand up when they're struggling to.
Not sure why you're being downvoted. You're absolutely right. These types of behaviors can be seen all throughout the animal world. Especially for animals showing degrees of eusociality.
These feelings are also extremely important for the preservation of one's species. No wonder evolution took this approach on multiple occasions: animals that aren't kind to one another tend to end up alone, leaving themselves open to being killed. The incentive is there to have strength in numbers, and being "emotional" in some ways furthers that.
The people they talk about are contemporary with the Babylonians, who had already absorbed the urban Uruk civilization that had started to peak a millennium prior. The difference isn't biology but resource density and climate favorability leading to higher social organization.
The costs and benefits faced by ancient humans were very, very different. Maybe a different way to frame the question would be: "At what probability of additional death, injury, or suffering (to you or other tribe members) would you abandon your injured/disabled?" Humans of that era did not have anything even remotely approaching modern medicine, and most lived at subsistence levels with starvation always at their doorstep. A huge portion of ancient peoples' energy and time was dedicated to obtaining calories, which means caring for the injured/disabled imposed a huge cost and risk. We can just as easily find examples of ancient peoples murdering or abandoning their injured, disabled, and weak. I don't think it would be right or fair to judge them through a modern lens. Of course they cared for their loved ones and mourned their deaths. But they faced much harsher circumstances, to which their cultures and beliefs were suited.
It would be helpful to provide some citations and evidence around the claim “most lived at subsistence levels with starvation always at their doorstep”. There is an increasing amount of evidence that this was not the case.
> most lived at subsistence levels with starvation always at their doorstep
Genuine question: is this something we know from evidence, or an assumption? I vaguely recall having read that comparison between skeletal remains of early farmers and hunter-gatherers indicated that the latter had a better diet, but I'm not sure if I'm remembering correctly or how much that observation generalizes.
We actually have a ton of evidence refuting this. The two things anthropologists spend their whole time rejecting in popular science are the barter myth and the idea that hunter-gatherer lives were "nasty, brutish, and short".
The nasty-brutish-and-short idea might have been true of many medieval European peasants, but the rest of the world wasn't cramped up with livestock, poverty, and poor sanitation, so other people simply didn't face as much disease. There was actually some really interesting work in bioarchaeology in 2018 showing that even extremely long lifespans were not actually that rare.[0] And those who made it to adulthood could generally expect a long life (obviously with tons of variation here). In the city of Cholula, Mexico, between 900 and 1531, most people who made it to adulthood lived past the age of 50.[1]
Not to mention the famous "Man the Hunter" symposium, where Marshall Sahlins introduced the Original Affluent Society thesis, which has since been largely upheld and reinforced.
Both early farmers and hunter-gatherers regularly endured calorie scarcity. The difference between them along this dimension is minor compared to the difference between either group and us and our calorie security.
You have a point, actually. Non-agricultural people had much more varied diets, and we have almost zero archaeological examples of famines leading to mass deaths among non-agricultural peoples, while we have plenty of examples of that happening to agricultural people. Agriculture was, especially initially, a huge step back in food security.
Obviously things have changed a lot since then, but some of the risks remain. Cuba is a fascinating case study in what happens when a modern agricultural supply chain collapses (due to US sanctions). Many died. But since then there's been a massive focus on locally grown food and even wild tending. I know many people who are into permaculture and alternatives to industrial agriculture who have traveled there to study it.
This feels like video game analysis. Unit is likely to die, therefore do not spend resources on unit. Leave unit behind.
There is no world in which I would leave a family member or close friend to die in the woods alone, especially if I have no idea what germs are, why people die when they bleed, and am listening to a voice I have heard my whole life cry out in pain. Even if I knew for sure they were going to die, I would sit with them, or move them, or something.
Thought experiment: Would you visit your mother or father in the hospital knowing they were going to die that day? I mean there's nothing you can do, why bother??
It's not about writing off the injured due to their low odds of survival; it's about your willingness to lower those odds for your other loved ones, or for yourself. How does your thought experiment change when caring for your mother/father means your children might starve?
OK, but for every person who tries to save a stranger from drowning, how many others choose not to? Probably not zero. If I saw a stranger drowning and they were larger than child-sized, I probably wouldn't attempt it; apparently it's pretty common for the drowning person to panic and use their rescuer as a raft, drowning them in the process.
Why do volunteer firefighters rush into a burning building to try to save children from a family they have never met before? Every day we are afforded examples of people sacrificing their personal interests for the benefit of others.
But also, biologists usually use a definition of "altruism" that does not include close kin. Richard Dawkins was explicit about this in his 1976 book "The Selfish Gene." Helping someone you are directly related to is not considered altruism.
It's literally a skill issue. The correct way to help a drowning person is to get behind them and then hook your weaker arm around their neck and head while doing backstroke with the other. Having them on their back, facing up (and out of the water), dispels the panic reflex. But this obviously requires you to be comfortable in the water and have some prior rescue training.
I think in the premodern era, you never saw strangers (not the way we do). You probably had a pretty good idea who everyone was, and probably knew most people pretty well. If that's even partially true, then although nowadays you might drive past a person on the highway, if your cousin or a lifelong trusted acquaintance asked for help, you'd give it. It seems that everyone you saw, especially anyone you saw injured or sick, was probably someone you'd known your whole life.
You're also heavily discounting the fact that you had to live not only with yourself if you did nothing, but the shame/angst of their family who you definitely lived next door to. TFA is about taking care of "their own", not strangers.
Good way to look at it. More broadly, there must have been different groups that practiced different policies with regard to the ill and injured. Some of the groups fared better than others. Since most modern societies do care for their ill and injured, it appears that this policy proved more advantageous, even if only slightly so.
This is the right question to ask. You can reason your way around things, but Occam's razor reigns supreme. Injured people can still do lots of work, as our most important tools were our brains, not our bodies. It's not hard to watch for predators near camp while sitting at the campfire, or to keep an eye on children - even if you can't resolve issues yourself. You could sit around making crafts for the tribe, repairing clothes, and more.
There's just way too much benefit to keeping the injured around. We don't need everyone working at top physical condition... ever.
Without knowing what happened, it's difficult to make the comparison between the Italian Years of Lead and what happened earlier today at Utah Valley University.
My understanding of the Italian political climate of the 60s, 70s, and 80s is that there were political groups/cells (on both the far right and far left) that organized around violent acts to further their political goals (which involved the eventual authoritarian takeover of the Italian government by either the far right or far left). For example, you can think of the Red Brigades as akin to the Black Panthers, but with actual terrorism.
In contrast, most political violence in America has been less organized and more individual-driven (e.g., see the Oklahoma City Bombing). For better or worse, the police state in the US has been quite successful in addressing and dispersing political groups that advocate for violence as a viable means for societal change.
This was an intentional adoption of leaderless resistance[0] in response to the vulnerabilities in centrally administered organisations of the 60-80s.
Resistance orgs across the ideological spectrum were systematically dismantled after decades of violence because their hierarchical command structures made them vulnerable to infiltration, decapitation and RICO-style prosecutions.
The Weather Underground, Red Army Faction, European Fascist groups and many white supremacist groups all fell to the same structural weaknesses.
The lessons were codified in the USA in the early 90s by Louis Beam[1] of the KKK and Aryan Nations movements, who wrote about distributed organisational models.
This was so successful that it cross-pollinated globally: other movements adopted variations of this structure, from modern far-right and far-left groups to jihadist organisations.[2]
This is probably the most significant adaptation in ideological warfare since guerrilla doctrine, and there has been a large-scale failure to adapt to it.
The internet and social media have just accelerated its effectiveness.
Today, "inspired by" rather than "carried out by" is the norm for ideological violence.
The KKK has been a distributed movement from the beginning, though, starting as isolated remnants of Confederate forces acting as terrorist cells in tandem with local officials and businessmen (e.g., plantation owners), and resurgent in the 20s and 30s (obviously sans the direct Confederate connections, replaced with local law enforcement).
It's not so much that we haven't been able to adapt to it as we've simply refrained from doing so. Their violence was in line with the interests of local elites.
Timothy McVeigh got his start watching Waco burn, hanging out with groups around the US "militia movement", and reading The Turner Diaries, and had like 3 accomplices.
Actually, it has been proven that at least two of the major terrorist attacks in Italy during the Years of Lead were false-flag attacks organized by a rogue faction of the secret services (politically aligned with the far right), funded and supported by the US, in order to politically isolate the Brigate Rosse movement and stop any advance of communism in Italy.
This was my take as well. At least microeconomics has moved away from large-scale observational studies and has moved into experimental and quasi-experimental studies.
While the methods alone cannot fix everything ("You can’t fix by analysis what you bungled by design" [1], after all), they get somewhat closer to unbiased results.