If I was hiring an agency to build a mobile app, and one of them came back with "195 days," I would have pretty high confidence that they've never built anything of noteworthy complexity before, and that they think that software projects can be planned to the day months in advance.
On the other hand, for operational promises, a 24-hour SLA means something more specific than a 1-day SLA: 24 hours is unambiguous, whereas anything based on days requires definitions (literally the next day? after one full calendar day has passed?).
So, no, smaller units are not more credible... smaller units make a more specific claim which in some cases is more credible, but in some cases is not credible.
365 days is just as appropriate as 1.00 years, assuming all your calculations involve measurements accurate to three significant figures. If they don't, 365 days could be considered a misleading level of precision, as opposed to saying one year.
They're a rough approximation compared to carrying through both accuracy and precision information and doing a full error analysis. Most of the time they're good enough, but sometimes you really do need the full detail. E.g., some widget being 56 mm +3 µm/-500 µm is very different from just saying it's 56 mm. Significant figures ignore the actual error distribution.
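A toy sketch of that widget example in Python (the numbers are from above; nothing here is a real tolerancing library, just arithmetic on the interval):

```python
# Asymmetric tolerance from the widget example: 56 mm +3 µm / -500 µm
nominal_mm = 56.0
upper_mm = nominal_mm + 0.003   # +3 µm
lower_mm = nominal_mm - 0.500   # -500 µm

# Quoting "56 mm" to two significant figures implies a symmetric
# +-0.5 mm band; the asymmetry of the real tolerance is lost.
print(f"true interval: [{lower_mm:.3f}, {upper_mm:.3f}] mm")
```

The printed interval [55.500, 56.003] is what a full analysis would carry through, and it is exactly the information that a bare sig-fig statement discards.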
Since I made the point, I'll also argue that it's often not relevant. Most of the time the particular error distribution doesn't matter, and significant figures are a good enough approximation. The biggest issue most people have with them is that they're first taught in high-school chemistry class, without being taught error analysis first. Significant figures are then seen as a difficult complication compared to just copying down what your calculator outputs, instead of as a simplified approximation of a full error analysis: something harder than the default, rather than an easier shortcut.
Also most calculators and unit converters don't take them into account by default (or at all) since someone might actually have accurate tolerance information to do a full analysis.
FWIW, in my field of web performance I'll follow, and ask others to follow, simplified significant-figure rules. For instance, I'll see people report a +1.5785% +- 0.10345% improvement; in that case, +1.6% +- 0.1% is a better presentation.
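A minimal sketch of that simplified rule in Python (`round_sig` is a hypothetical helper, not a standard-library function): round the uncertainty to one significant figure, then quote the value to the same decimal place.

```python
import math

def round_sig(x, sig):
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - math.floor(math.log10(abs(x))))

# The measurement from the comment above
value, error = 1.5785, 0.10345

# 1. Round the uncertainty to one significant figure.
err_rounded = round_sig(error, 1)                # 0.1
# 2. Match the value's precision to the uncertainty's decimal place.
decimals = -math.floor(math.log10(err_rounded))  # 1 decimal place
val_rounded = round(value, decimals)             # 1.6

print(f"+{val_rounded}% +- {err_rounded}%")      # +1.6% +- 0.1%
```

This is only the simplified rule; it assumes a roughly symmetric error and says nothing about the underlying distribution, per the caveats above.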
I think the CS industry and the tooling you mention should do a better job here.
AT&T cellular lists a user's data usage in kilobytes rather than gigabytes on the account page to obfuscate their poor business practices. Paradigm of credulity.
Silly. If a bunch of people start using smaller units in order to make their statements "seem" more credible, such statements won't seem that credible after a while. And using smaller units than the listener would expect should already set off their sense of bullshit if that degree of precision is unwarranted. If it doesn't, congratulations for successfully preying on the gullible and the ignorant.
Speaking strictly for myself... this is basically why, despite having grown up in a metric country and technically living in one (Canada), I still sometimes use inches, feet, ounces and sometimes even gallons for rough estimates. If the unit fits the quantity, the uncertainty is implied, as in "he was about six feet tall" or "add a couple of inches".
Related to this, but I'm working on a little project where people can hire a server for a specific game (RRRE). In that context, "Buy 2 extra hours of server time" sounds a lot better than "Buy 120 minutes of server time".