OH LOOK it is a critical feature for the release being specified AFTER the release
-
I'm so sick of this mickey mouse crap. Move slow and break as many things as possible, I guess.
-
Having cooled down a bit, I own more of this problem than I was initially admitting.
HOWEVER, the reason I own that part of the problem is previous hacks that got shoved in for similar "yea whatever just slap something together and don't ever specify the data/behavior" reasons.
It's just an exhausting way to develop software that is meant to be reliable for more than 2 weeks.
-
Well, I'm angry about this again. Some amount of shouting was involved this morning after I dared to try to make sure we actually understood what feature we're supposed to be implementing.
-
It took a solid 3 hours and reverse engineering the answer without the help of the authors of the upstream changes, but I finally understand the scope.
It doesn't have to be like this but boy does it seem to happen a lot :blobfox0_0:
-
Me: [filing a ticket for at least three months from now]
PM: how many days will that take?
-
@[email protected] I've never understood this demand that there be a realistic estimate of the time. Because sometimes knowing the scope is the problem, particularly for bug fixes that end up being 12 layers deep but a one line change. So it's like, well, in order to estimate how long it will take with any reasonable accuracy, I will have to do the thing.
I've had the good fortune to work with people who understand that and are just like, "just say a week or two" but.
-
@aud the really frustrating thing is that in this case, I *can* reliably say that the ticket will take a day or two. It's mostly deleting code.
I just don't *want* to fall into a loop where the PM comments on every single ticket filed asking "how many days????" and I know that every time I take that unserious question seriously, I'm setting a precedent.
-
@[email protected] (total tangent, but "I don't know" is a phrase that does not seem to exist in software. Even in machine learning! I always wanted to train a model that would have "unknown" as an answer because _why wouldn't it_? But everyone likes to bake in the idea that a problem is self-contained and complete and answerable and also that everything is knowable. It is not).
-
@[email protected] 100% "at least a whole sprint" then, totally.
-
@aud this is the only reason I work in the industry at all: $employer's secret-sauce classifier has explicit support for uncertainty.
As in, not faking it by thresholding a softmax and pretending all-below-threshold is an N+1th class. It's pretty rad, actually.
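For concreteness, here's what that "fake" version looks like: a minimal Python/NumPy sketch of thresholding a softmax and treating everything below the cutoff as an extra "unknown" class. The class names and the 0.7 cutoff are invented for illustration; this is the trick being dismissed above, not the actual classifier.

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the last axis
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical class names and cutoff, purely for illustration.
CLASSES = ["cat", "dog", "bird"]
UNKNOWN = "unknown"        # the pretend N+1th class
CONFIDENCE_CUTOFF = 0.7

def classify(logits):
    """The 'fake uncertainty' trick: if the top softmax probability is
    below a cutoff, report 'unknown' instead of the argmax class.
    The model itself never learned anything about not knowing."""
    probs = softmax(np.asarray(logits, dtype=float))
    top = int(probs.argmax())
    if probs[top] < CONFIDENCE_CUTOFF:
        return UNKNOWN, float(probs[top])
    return CLASSES[top], float(probs[top])

print(classify([4.0, 0.5, 0.1]))   # confident -> ('cat', ~0.95)
print(classify([1.0, 0.9, 0.8]))   # flat logits -> ('unknown', ~0.37)
```

The giveaway is that the probabilities still sum to 1 over the known classes, so "unknown" is never something the model can actually say; it's bolted on after the fact.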
-
@[email protected] now I'm just mad that no one listened at my job when I kept proposing we explore this back in 2019.
-
@aud the value of "I don't know" is wildly underappreciated even when the atoms of a system are human beings (cf. programmers/other SMEs almost never say these words), so it is depressingly understandable that ML as a whole is not very interested in it aside from parlor tricks