Wouldn't that imply that >80% of all monitored telnet sessions were exploit attempts for the specific CVE in question? Even with the scale of modern botnets, that seems unrealistic for a single vuln that was undisclosed at the time.
In no particular order: 404 Media, Ars Technica, BleepingComputer, The Register, The Verge, and Tom's Hardware.
These usually sit in the corner of my screen throughout the day. Some are better than others for work purposes. The Verge could probably go, and 404 is a bit more socially-focused than the rest. In particular though, having rapid updates from BleepingComputer and El Reg is a great way for me to learn about new vulns, issues that might affect my users, etc.
Seconding this. I have one for my work desk, where (surprisingly enough) it made a lot of sense. The DPI isn't as big of an issue as people make it out to be if your workflow doesn't depend on high density, but the curvature definitely could benefit from being a bit tighter. You need a fairly deep desk or a keyboard tray if you don't want to be turning your head a bunch.
That being said, having this in combination with PowerToys FancyZones has been fantastic. At any given time, I'm usually running 1-4 main working windows plus Signal, Outlook, and an RSS reader. This gives me more than enough real estate to keep them all available at a moment's notice. Roughly 40% of the screen is dedicated to Signal, Outlook, and my RSS client, with the interior 60% hotkey-mapped to divide in different proportions. Compared to my old setup (one ultrawide plus two verticals), it's been awesome.
Just a nitpick: analog NTSC was roughly 480i at (just under) 30 FPS. The latter is significant, as 3:2 pulldown (which would have been necessary if the station's cameras were scanning at 24 FPS) would have introduced judder and made tracking even harder. To its credit, interlacing also improved motion clarity at the expense of detail, but whether that's a net benefit is ultimately a matter of preference.
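For anyone unfamiliar with the cadence, here's a minimal sketch of why 3:2 pulldown judders. (Illustrative only; real telecine also deals with the 1000/1001 rate offset that makes NTSC ~29.97 rather than 30 FPS.)

```python
# 3:2 pulldown maps 24 fps film onto ~60 interlaced fields per second.
# Each film frame is held for alternately 3 or 2 fields, so frames are
# displayed for unequal durations -- that unevenness is the judder.

def pulldown_fields(n_frames):
    """Return how many video fields each successive film frame occupies."""
    return [3 if i % 2 == 0 else 2 for i in range(n_frames)]

cycle = pulldown_fields(4)   # one full cadence cycle: [3, 2, 3, 2]
assert sum(cycle) == 10      # 4 film frames -> 10 video fields
# Scaling up: 24 frames * (10 fields / 4 frames) = 60 fields,
# i.e. one second of film fills one (nominal) second of NTSC video.
```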
Just a guess, but Google Maps has (had?) an integration with Spotify that adds basic playback controls below the map view during navigation. It's meant to keep you from needing to switch apps while driving.
Just my $0.02 as a net/sysadmin for a small municipality in the US:
A big part of why we haven't been able (or bothered) to migrate to a proper .gov domain boils down to the technical debt we'd have to pay down in the process. Everything we do runs on our non-.gov domain, not least our Office 365 connectors. On top of that, end users' day-to-day communications with the public go through the existing domain. Changing that in any capacity could disrupt ongoing communications and potentially leave staff liable for dropping the ball somewhere. Not to mention that every internet account ever created by staff under the current domain would need to be migrated or risk being lost forever.
Additionally, we're a small team. Only myself and one other individual would really have the technical knowledge to migrate our infrastructure. The opportunity cost involved would be massive. There are grants available to help us with this, but obtaining/using those can get complicated at times.
Ultimately, the pros just don't outweigh the cons. From a purely academic angle, should we have a .gov TLD? Absolutely. In practice though, the residents and staff are familiar enough with the current one to render it a non-issue. The average non-technical user doesn't "see" "[municipality].[state].gov". They aren't familiar with the concept of a domain hierarchy at all. They just memorize "[municipality_website]" and move on with their day.
> They just memorize "[municipality_website]" and move on with their day.
I haven't even done that much; I couldn't tell you offhand the URL for my county government. I always just search on Google, which takes me right to the page I need (roads, solid waste, library, etc.)
> The average non-technical user doesn't "see" "[municipality].[state].gov". They aren't familiar with the concept of a domain hierarchy at all. They just memorize "[municipality_website]" and move on with their day.
That means they can easily be redirected to a phishing website.
Absolutely, and that's a risk that we carry, especially in the public sector. That being said though, I don't know if adopting a better-regulated domain is itself enough to alleviate that.
The very unfortunate reality is that many (most?) users evaluate phishing attempts with the null hypothesis that "this is trustworthy". They are looking for evidence that something is wrong and assuming all is well if they don't find it. To that sort of user, the thinking goes something like:
* Some trustworthy sites use .com.
* My municipality is trustworthy.
* My municipality uses .com.
If you draw out the Venn diagram, there's a clear gap in that line of thinking. That doesn't matter to someone's Great Aunt Linda, though. She just knows that .com is what goes after Amazon and Google, so it must be good.
With that in mind, could using .gov help to protect those folks? To a certain extent. I can see the argument for keeping the more discerning few from getting scammed. For the broader group though, it won't change anything.
Offhand, the alternative solution that I'd offer would be providing clear communication standards to the public. Specifically, defining when, how, and from whom municipal notifications go out. Think of it like the IRS only sending physical letters; archaic as it seems, it makes it pretty obvious that an email "from them" is bogus. The clearer someone's understanding of where to find us is, the more optimistic I am that they'll get where they need to be.
Nah, even worse: they type “municipality” or some butchered typo of it into their browser, triggering a Google search, and click the very first link they see (sponsored or not), so they can all too easily be tricked into phishing websites.
Arguably we’re all victims of the decade or so when Google was so good at serving up the right site, so most people just got used to not knowing any URLs. People Google “YouTube” or “cnn” rather than type even the .com after those words.
IMO, poor website UX plays a big part in this too. People are far less likely to Google "[city] public works" if "public works" is a top-level menu item on the city website. When you first need to click a hamburger menu, hover over the "departments" entry, select "other departments", and then pick "public works" from the site header though, Joe Public is just going to do a search.
Yes, what really makes people like us cry is watching someone type in just the word Google into the ubiquitous search/URL bar, hit enter, click Google’s first result for Google which is google.com, then type “cnn.com” into the search field, hit enter, and then click an ad or result for CNN.
You say there are grants available, but given the current environment actually relying on those seems risky - even if you were actually to get the money up front it seems like it might get clawed back.
You are correct. This is a consideration at all levels of government currently, with faith in those grants' persistence varying based on an individual recipient's responsibilities.
> The average non-technical user doesn't "see" "[municipality].[state].gov". They aren't familiar with the concept of a domain hierarchy at all. They just memorize "[municipality_website]" and move on with their day.
You've just highlighted the problem. This is something every single human being in America should know, and arguably almost everyone in the world.
This falls directly under the rubric of Basic Computing Knowledge > Basic Internet Knowledge.
Every single time I see someone searching for "microsoft" or "apple" I immediately stop them and tell them, "You've already done most of the work. Microsoft and Apple are commercial entities. Add .com at the end, which is what .com means. Commercial. You're adding extra work for yourself."
Yes, a few people pop off at the mouth, at which point I remind them that ignorance of a thing is easily remedied with a little give-a-damn, and it saves everyone time and money.
Talk about a fucking miserable failure of education. I'm 44. I expected the generation 20 years younger than me to be impossibly skilled with computers to the point that I wouldn't hope to even match them, much less surpass them. Instead what we got was a world where we dumbed every goddamn thing down so even the most drooling moron can utilize it.
They should know the basic principles! For the same reason they should know what a noun and a verb is. For the same reason they should know that you can multiply something by 10 by adding a zero. When so much of our lives revolve around the Internet, basic literacy about its fundamental mechanics makes a lot of sense. The alternative is the world we live in now, where it’s trivially easy to scam people because they believe www.irs.gov.login.html.b3293.cn/login is functionally equivalent to www.irs.gov/login.html?b3293.cn
Imagine if people were this bad at counting, or at knowing the difference between US currency and monopoly money.
> so much of our lives revolve around the Internet
This was my core point, that this is true for you but is not actually true for everyone. To claim the entire world needs to know this when people get by just fine every day without being online or being on a device is absurd to me.
I wasn’t only talking about nerds. There are not a lot of people anymore who are not impacted by the Internet and who don’t usually use it.
And they don’t get by just fine every day.
People get phished and scammed constantly, in many ways that could be prevented if people had and remembered like a 2-week unit in high school on how the Internet works.
I’m not saying they need to understand even the fact that DNS converts names to IP addresses. Merely that it’s a hierarchy and how to trace responsibility (reading from the right side).
That’s no more difficult to grasp (if taught properly) than how to read the address on an envelope and understanding that “San Francisco, California” means the city of San Francisco located in the state of California.
Other lessons in the unit would include how email works including its lack of guarantees of authenticity. And finally, what encryption means and applying that knowledge to safe and unsafe ways of storing and transmitting information.
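To make the "read from the right" lesson concrete, here's a minimal sketch. (Caveat: a robust version would consult the Public Suffix List, since registrable domains aren't always just the last two labels, e.g. `example.co.uk`; this naive version is only for illustration.)

```python
# Naive "who actually controls this hostname?" check: the trust-relevant
# part is on the RIGHT of the hostname, regardless of what familiar
# names appear on the left.
from urllib.parse import urlsplit

def naive_registrable_domain(url):
    """Return the last two hostname labels as a rough registrable domain."""
    host = urlsplit(url).hostname or ""
    labels = host.split(".")
    return ".".join(labels[-2:])

# The real IRS site resolves to irs.gov...
print(naive_registrable_domain("https://www.irs.gov/login.html"))
# ...while the lookalike is actually controlled by b3293.cn,
# despite "irs.gov" appearing prominently on the left.
print(naive_registrable_domain("https://www.irs.gov.login.html.b3293.cn/login"))
```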
> The time to do something to standout is while you are working.
This is an issue I ran into recently during my post-undergrad job hunt. Having left college without an internship, it was difficult to distinguish myself in any meaningful way. In my opinion, major, career-defining work needs at least six months of dedication to carry any weight on a resume, and most people don't have the savings to go that long between jobs.
I was fortunate enough to secure a well-paying internship over the next six months, but in all honesty I think I got lucky. It's tough out there if you don't have the existing background to set yourself apart.
While I have the chance here, I want to say thanks to you and your team for the amazing work that you all are doing. I doubt you need further validation, but believe me when I say that the ideas you're describing do work. My entire career in CS started with Scratch in intermediate school (somewhere between 2010 and 2012). Having an interface with a low barrier to entry, particularly for someone whose economic situation didn't allow for engagement with more sophisticated tools, let me begin engaging with computing in ways that I'm not sure I would've been able to otherwise. It was also a bonding experience for my peer group and gave me a shared interest to meet people over. On the cusp of graduating with a bachelor's in CS, I've been reflecting a lot on how I got here, and Scratch certainly played no small part in that process.
Would you be willing to elaborate on what you've seen a bit? I'm on the last semester of my CS degree now and have definitely seen a lot of similar effects as a result of Zoom courses and general isolation during COVID lockdowns. There's generally less willingness to reach out to people than there used to be, and people seem to prefer dividing up tasks and working independently over collaborative work (e.g. each person in a group project having a "role" rather than working jointly on a large segment). There's also a general preference towards working at home without any external interaction whatsoever and a lack of willingness to form study groups. As I start moving into the job application phase of things, I can definitely see how these traits can be seen as off-putting to hiring managers, and I'd like to avoid falling into similar traps. Is there anything else you've noticed that would be worth avoiding?