The longer the manager is out of the game, the harder it is to return. Getting back in takes time, and depending on age and income, it may eventually become impossible for some people.
Why were these sharp edges not discovered in UAT? It seems hard to believe that the people who make Photoshop do not use Photoshop to the degree necessary to notice these regressions. How will you avoid these kinds of problems in future updates as your transition to a modern design continues?
In a word: they were. We do use Photoshop (though not to the level or extent of most users) and noticed the regressions. Shipping software is in perennial tension between getting it perfect and getting it out the door.
Going forward, we would like them fixed, too. Personally my hope is the message from user feedback like this is heard loud and clear, and we respond appropriately.
I'm far from an Adobe fan, but I feel the need to defend them just a little here.
Everyone with non-trivial software has to do this to some extent. Perfection just isn't possible. The real measure is where the company finds the balance. I think Adobe needs to tilt toward perfection a bit more, but this is not something people can do unilaterally without buy-in from the very top of the chain, which I'm guessing GP is not.
Keep in mind that Photoshop isn't the near-monopoly that many people think it is, especially in light of generative AI. If they take too long to ship features, they will likewise be criticized by paying customers who feel Photoshop is hobbled.
They said they were the principal scientist for this change and voluntarily took partial responsibility. If you look at this in the greater context (things were perfectly fine for decades, and then they broke), I'm not sure how it's at all defensible. Just looking at the old vs. new modals, there aren't even really any new features. It's just breakage.
Yeah, but there’s a world of difference between “not perfect” and “we rounded a few sliders, and now the modal is a nightmare to use”.
Rolling out a new UI for such a staple piece of software is smart. But doing it the way they are doing it is absurd. Why even release it when you have basically nothing to show besides a broken modal, rounded sliders, and a couple of things made thicker? That's not being mindful; that's someone's unfinished staging build that got pushed to production by mistake. It's insane that they're doing this to Photoshop (of all apps). And honestly, it's even more insane that anyone would defend anything from Adobe after all the crap they've pulled (and continue to pull) over the years.
They are wrong. They are going about it the wrong way. And paying customers deserve a hell of a lot more. Adobe OWES us a better treatment. Big time.
And unfortunately Photoshop very much is the monopoly many think it is. Those complaining about Photoshop being hobbled because it can’t hallucinate AI slop are not Adobe’s target audience and main source of income. Adobe’s only as crap as it is now exactly because it knows it holds basically the entire graphic design industry in a stranglehold. No other apps are currently even close to Adobe’s in terms of compatibility, functionality and support—unfortunately. I wish someone would come and claim Adobe’s crown, but that is simply not happening.
> Shipping software is in perennial tension between getting it perfect and getting it out the door.
First, do no harm. Changing functionality that works is not in tension with getting it out the door; shipping regressions is. Ensure it is working before shipping by hiring testers who use the product to the level and extent of most users.
> We do use Photoshop (though not to the level or extent of most users) and noticed the regressions.
Is there something you want to tell us about management? This is crazy if what you mean is that you knew you broke this for power users but shipped it anyway, or that you don't have power users on payroll, constantly testing your product, whom you can call "part of the team".
Disclosing my bias up front: I think Adobe is an evil company and I actively avoid them. This is not personal against Adobe employees however. I know there are a lot of people who want things to be better and work their asses off toward that goal.
Indeed, I don't think most people can appreciate how hard the tension is between shipping and perfection. As a fellow perfectionist, it kills me to ship things that I know aren't perfect, but I've had to work on becoming more of a pragmatist, because if I had my perfectionist way, shipping would take years and feedback loops would be so long that it would be somewhat self-defeating (though that's a personal problem). I appreciate you taking the time to respond here, even knowing you'll catch some heat.
We are not talking about perfection. We are talking about breaking a stable piece of software and affecting people's muscle memory with minimal upside to users. People provide for their families with Photoshop. It is unacceptable to push a change that impacts millions of people and then throw your hands up in the air and claim that this is all inevitable because perfection is impossible.
If this was a startup or new software finding a market fit it would be different. This is industry standard, professional software that impacts livelihoods. More thought should go into each release because of this fact.
> Shipping software is in perennial tension between getting it perfect and getting it out the door.
Photoshop is the premiere image editor that has been in existence for decades. The issues you are responding to are fundamental changes to how the application behaves. It defies belief that your team and its processes have this little respect for dedicated users who have spent thousands of dollars on your product over the course of years. I understand shipping software. Do you understand your users?
> Why were these sharp edges not discovered in UAT?
These kinds of sharp edges should *never* have made it as far as UAT. All of these should have been caught in the first prototype and never made it beyond that point.
The fact that they made it all the way to the shipping product shows that too many responsible parties were asleep at the switch.
Obviously they should have a few power users on payroll who would find these obvious regressions quickly, and we could call them part of the team that makes Photoshop. I'm not sure why this, or what the lead scientist said, counts as valid justification. Just hire people who use Photoshop. If they already do this, then the people who make Photoshop do use Photoshop to a sufficient degree.
But moreover, if one has developed Photoshop for 15 years, I'm pretty sure they are aware of table-stakes power-user features.
And then one more point:
> Why?
Because that's what it takes to develop high quality software tools. This shouldn't even be up for debate when charging money for software.
They actually have a very slick and very active beta program. I use the betas 99% of the time, and they are updated practically weekly. I'm surprised something like this wasn't reported en masse very quickly. Maybe it's just not annoying enough to reach the threshold for someone to file an issue. I know it's the sort of regression where I would huff and puff and get on with my day.
I think the constructive criticism is best directed at whatever process you are following. That process allowed a very visible, user-facing change to ship in a widely used piece of software. How did this change make it to production without some process catching its impact? Was there really no internal discussion from a code review, at least? This seems hard for me to believe. I expect more from Microsoft.
> Was there really no internal discussion from a code review at least? This seems hard for me to believe.
The outlined story feels unfortunately very believable to me.
Teams need to push out as many features as possible, and nobody stops even for a second to think about how a feature might affect other flows, or users not covered by the feature request.
It might have been quickly reviewed to check if the code does what it needs to do (add the coauthor note).
Do you think reviewers will think about unwanted effects when they need to get back to feeding their own poorly thought-out and under-spec'd features to their LLMs?
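For context, the "coauthor note" in question is presumably git's Co-authored-by commit trailer, which GitHub parses from the last lines of a commit message to attribute a commit to additional authors. A hypothetical example of what such a commit message looks like (the subject line and email address here are placeholders, not the actual values VS Code writes):

```
Fix slider rendering regression

Co-authored-by: GitHub Copilot <copilot@example.com>
```

The complaint upthread is that this trailer was appended by default, even to commits where Copilot was never used.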
> Was there really no internal discussion from a code review at least? This seems hard for me to believe.
> The outlined story feels unfortunately very believable to me.
100% agree here - we seem to forget that most developers hate code reviews. I actually laughed out loud at the use of the word "discussion," it's so rare people want to get together and talk about changes. By the time the PR is up anything that stands in the way of merging and shipping is seen as a nuisance.
To my mind this whole debacle is not really the individual's fault, or even the team's fault, but the fault of the economic pressures that drive people into situations like this.
Fair point. We did catch it internally in testing (we use VS Code for all our work, so some folks did stumble on it), but I think we underestimated the impact and should do a better job of that.
This is honestly the most concerning part of all of this. You're saying you knew that this exact bug was present up front and still decided to release it?
This basically invalidates the entire premise that it was an innocent mistake. It's impossible for me to believe that you actually thought that people wouldn't care about 100% of their commits being attributed to Copilot even when it was never used. Either you're misconstruing what you caught with the testing beforehand or your entire development process is tainted, because there's no way that a non-evil corporation would see this default behavior and think that people would be fine with it. It seems far more likely you just thought you could get away with it.
I think there is a "ship fast" component here that should be adjusted. Product Management introduced weekly "stable" releases in March, no matter the content.
I think so too, but my point is that even according to their own words about what happened, the best possible interpretation is that they didn't mean to do it but knowingly let it happen. I agree that a worse version is more likely, but it's pretty damning when even the ceiling for what they can plausibly claim is "we intentionally didn't bother stopping it once it happened accidentally".
A generous read of this comment might be that you did catch it internally in testing AFTER it shipped but shrugged it off as something you'd patch in the next release in a week or two. Is that what you meant here?
Or that it was caught but didn't surface fully before release?
A helpful governance policy here might be that anything that mutates user content without opt-in consent requires a distinct sign-off, or a double sign-off, if the goal is to prevent this from happening in future.
I saw a lot of "they made a game I like (Halo), therefore they must not be that bad" from the gaming crowd that only experienced the console side of it.
Also, who/what group is pushing for this change internally and what is the opinion of the team implementing it? What is the road map and vision for AI in VSCode?
I am a long time mac user and I agree with all of their points. I guess you disagree, but I am not sure why you are being dismissive. Each point is a legitimate criticism from many peoples' points of view.
I acknowledge the complaints, I love a good complaint! My issue is that these superficial, and in many cases, easily remediable annoyances add up to a "crappy OS". MacOS has to satisfy a very diverse userbase from Paris Hilton-types to grumpy Hacker News readers (but thankfully not Bank of America), and I think they do a better than decent job at it.
I don't consider the Mac's less-than-half-assed search facilities to be a superficial problem. I don't see how you can argue that a search that doesn't show WHERE it found hits is competent. Beyond that, it often just doesn't work. You can be sitting in a directory full of JPEGs and search for .jpg and get zero results. Zero.
And dismissing the asinine removal of the "get mail" button from Apple's default E-mail program because YOU don't happen to use it isn't exactly respectable, is it?
Mac OS DID satisfy a great many people; I've seen no credible (or even incredible) argument that the recent raft of faffing about with the UI has brought new users into the fold. That's the foundation of so many people's outrage over it: The changes offer no improvement and don't address any longstanding user requests. But it IS demonstrably regressive, and subjectively dated and tacky.
"Transparent" UI came and went 20 years ago for good reason.
I want to ask a dumb question: if it was known that this area was high traffic, why are archaeologists only just now discovering these wrecks? Isn't it obvious to search this area for wrecks, given its history? The article hints that climate change is increasing urgency. Is it the case here that we knew there should be wrecks, but climate change made the search happen?
I've actually had this conversation before with an archeologist with some naval archeology experience.
Shipwreck hunting is ridiculously expensive. The resources required to exhaustively explore 100 sqm of seafloor are probably 1000x those required to do it on land. There aren't any easy shortcuts: radar doesn't work underwater; sonar does, but is extremely low resolution; lidar works pretty well, but only if the water is very shallow and clear; underwater drones have extremely limited mobility and communication capability. A lot of funding in archeology tends to go to easier or higher-probability wins, which for the past 10-15 years has mostly meant aerial lidar over heavily vegetated areas.
The best shipwreck hunters rely almost entirely on probabilistic models of where they might find shipwrecks, and the most useful probabilistic models have all been developed in the last 30-40 years. In fact, some of the best, like Bayesian search theory, actually originated as a formalization of heuristics already used in treasure and shipwreck hunting.
In that respect, I would argue that this find is actually the result of recent advances in probabilistic modeling (along with other advances in data engineering with respect to extremely messy historical data sources) that have just barely gotten accurate enough to start getting the funding it needs to do the harder work of actually working on the sea floor.
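To make the Bayesian search idea concrete, here's a toy sketch of the core update rule: you hold a prior over grid cells, search the most promising cell with an imperfect detector, and, on a miss, shift probability mass away from that cell toward the rest. All numbers are hypothetical, and real search models are far richer (drift, cost per cell, detection varying with depth and sediment):

```python
# Toy Bayesian search update (all priors and detection rates hypothetical).
# prior[i]: probability the wreck lies in cell i
# detect:   probability that surveying the correct cell actually finds it

def update_after_miss(prior, searched, detect):
    """Posterior over cells after searching `searched` and finding nothing."""
    miss = 1.0 - prior[searched] * detect   # total probability of a miss
    post = [p / miss for p in prior]        # renormalize all cells
    post[searched] = prior[searched] * (1.0 - detect) / miss
    return post

prior = [0.1, 0.6, 0.3]   # e.g. historical records point strongly at cell 1
post = update_after_miss(prior, searched=1, detect=0.8)
# Mass shifts away from the searched cell toward the unsearched ones,
# telling you where to spend the next (very expensive) survey day.
```

The practical appeal is exactly the point above: survey time on the seafloor is so costly that squeezing information out of every miss, rather than sweeping exhaustively, is what makes these searches fundable at all.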
It's also worth remembering how little money goes into archaeology in general.
I can think of two nationally-significant archaeological sites in Central Europe - both were partially excavated about fifty years ago, to varying but fairly limited degrees, and then gently reburied, because there wasn't enough money to keep things going.
The site of one has a poorly-trafficked tourist centre today, the other is a clearing with nothing more than a tourist plaque. Both are likely candidates for previous capital cities, so they are obviously significant, but the money just isn't there to do anything about them. I seem to recall reading somewhere that over 90% of one of the sites remains unexcavated.
These are land sites, so relatively inexpensive compared to sea sites. If this is how willing we are to fund nationally-significant land digs, I imagine sea archaeology would be comparatively even more impossible to fund.
Yes, exactly. I'll add that most shipwrecks haven't actually been discovered by archeologists; they've been discovered either by amateur divers by accident, or by treasure hunters, who by default seek only specific ships with known cargo. It's just too expensive for academic archeology organizations to pursue. Take away the quest for profit, and almost nothing gets discovered, no matter how historically significant.
I suppose this is an area where amateurs can help out. I live near the Great Lakes. Once in a while, amateur divers will discover a new shipwreck. It's like the way that amateur astronomers used to look for comets with the hope of being the first to report one.
You have to get the lidar down to its scan range, which they do with drones. The effective scan range of that particular lidar model is 1.5 to 15 meters, compared to 1000+ m for aerial lidar. That means they had to get the scanners within meters of the bottom using very expensive drones, and the process was still extremely slow. They still had to target the search area using probabilistic models built from available historical records, as a general search would have been way too expensive.
Another solution: train wild dolphins to recognize the goal (e.g. sunken ships), do the scan for you, and receive some compensation in exchange for the work (tasty food? play balls?). One would have to check the depth range of dolphins, though.
Somebody investing a zillion to hire people to train and feed dolphins would most probably:
1) have enough money to buy robots instead and get rid of the legal and logistic trouble
2) want to use the dolphins for activities that grant a better return on investment, like marine engineering or war (mining/demining).
Every mayor of a coastal city in California or South Africa (with a big beach visited by thousands of swimmers a day) would pay solid money for bay-watching and shark-deterrent services that really work without the need for eyesore nets. People love to swim with dolphins too, so they would be another tourism resource in themselves.
The time of your dolphins would be just too valuable and expensive to do Archaeology.
Definitely not an expert here, but I was always under the impression that dolphins can only be trained in captivity. If they aren't reliant on you for food, they have no need to perform for you.
SS Central America, sunk in 1857 with 14,000 kg of gold: "the old insurance companies who'd paid out when the original ship [sank] demanded, and were awarded in court, a substantial amount of the gold recovered."
https://www.thenakedscientists.com/articles/interviews/how-f...
That's the main reason. Also, marine archeology is expensive. I once heard an archeologist say that if the remains have spent centuries underwater, one more is less harmful than looters.
Underwater sites are particularly hard to protect from looters, compared to above-ground or underground sites. If the stakes are high enough, scuba diving is a reasonable option for the criminally minded.
It wasn’t long before Costa Concordia was looted for its treasures.
Passengers' possessions, e.g. jewelry and watches. Technical equipment on the ship. Items from the on-ship shops. Interesting artefacts (ships' bells are often prized loot from wrecks).
The punishment was made to be a deterrent for all who might consider doing the same as Alex Jones. You have made a straw man about destroying him forever.
You can get a lot of mileage out of the indica/sativa distinction. Regardless of strain, indica-heavy tends to chill people out, while sativa-heavy strains give energy and more anxiety. Everyone is different, and YMMV.
Because it can do both. That may not be valuable to everyone, but it is a beneficial feature for many people. Also, Apple's keyboard case has a fantastic keyboard and trackpad that is a pleasure to type on.
That is because in 1943 Josiah Samuels wrote an influential book called, "Into the Fortnite" that depicted characters who were involved in a long, protracted battle. Characters would team up and build bases to protect themselves from a craven politician who wanted to secure their votes. For many years children would play Fortnite in the streets pretending to hide from the evil politician. Eventually, this game became quite popular to the point of achieving household ubiquity. A lot of older folks get confused and think this game was a video game!