antipaucity

fighting the lack of good ideas

storage series

Some of the content is mildly dated, but this series of posts from a few years ago is still something I refer to quite often:

determining the ‘legitimacy’/’reliability’ of a domain

I’ve recently been asked by several people to investigate websites (especially e-commerce ones) for reliability/legitimacy.

Thought someone else might find my process useful, and/or have some ideas on how to improve it?

So here goes:

  1. Pop a terminal window (I’m on a Mac, so I open Terminal – feel free to use your terminal emulator of choice (on Windows, you’ll need to have the Windows Subsystem for Linux or Cygwin installed))
    1. Type whois <domain.tld> | less (if you want to script these checks, there’s a sketch just after this list)
    2. Look at all of the following:
      • Creation (Creation Date: 2006-02-22T01:12:10Z)
      • Expiration (Registry Expiry Date: 2023-02-22T01:12:10Z)
      • Name server(s) (NS3.PAIRNIC.COM)
      • Registrar URL (http://www.pairdomains.com)
      • Registrar (Pair Domains)
      • Contact info (should [generally] be anonymized in some manner)
    3. Possible flags:
      • If the domain’s under 2 years old, and/or the registration period is less than a year (we can talk about when short registrations may make sense in the comments)
      • If the name servers are “out of the country” (which, of course, will vary based on where you are)
      • If the contact info isn’t anonymized
  2. Load the website in question in a browser (use an incognito and/or proxied tab, if you like) and review the following types of pages:
    • Contact Us
      • Where are they located?
      • Does the location stated match what you expect based on the whois response?
    • About Us
      • Does it read “naturally” in the language it purports to be written in?
        • I.e., does it sound like a native speaker wrote it, or does it sound stilted or mechanically translated?
      • Does it match what is in the whois record and the Contact Us page?
    • Do they provide social media links (Twitter, Facebook, LinkedIn, Instagram, etc)?
      • What do their social media presence(s) say about them?
    • Return/Refund Policy (for ecommerce sites only)
      • What is the return window?
      • How much will be charged to send it back and/or restock it?
    • Shipping Policy (for ecommerce sites only)
      • How long from submitting an order to when it ships to when it arrives?
      • Where is it shipping from?
    • Privacy Policy (only applies if you may be sharing data with them (ecommerce, creating accounts, etc))
      • What do they claim they will (and will not) do with your private information?
  3. Is the site running over TLS/SSL?
    • You should see a little padlock icon in your browser’s address bar
    • Click that icon, and read what the browser reports about the SSL certificate used
    • Given that running over TLS is 100% free, there is absolutely NO reason for a site to NOT use SSL (double especially if they’re purporting to be an ecommerce site)
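
If you do this kind of review often, steps 1 and 3 lend themselves to a quick script. Here’s a minimal shell sketch (the script name is mine, and the grep patterns assume the registry uses field names like the ones shown above; adjust to taste):

#!/usr/bin/env bash
# vet-domain.sh: quick pass over steps 1 and 3 - key whois fields, plus the TLS certificate
# usage: ./vet-domain.sh example.com
domain="${1:?usage: $0 <domain.tld>}"

# registration details: creation/expiry dates, name servers, registrar
# (exact field names vary a bit from registry to registry)
whois "$domain" | grep -iE 'creation date|expiry date|name server|registrar'

# TLS certificate: who issued it, and its validity window
echo | openssl s_client -connect "${domain}:443" -servername "$domain" 2>/dev/null \
  | openssl x509 -noout -issuer -dates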

Reviewing these items usually takes me about 2-3 minutes.

It’s not foolproof (after all, better fools are invented every day), but it can give you a good overview and relative confidence level in the site in question.

3-month review

I’ve been running an M1-powered MacBook Pro since late April.

Here’s my experience so far: it Just Works™

That’s it

That’s the tweet

Want more? Sure!

Battery life is bonkers-awesome! I can run for over a full working day with my VDI client, YouTube, web browser sessions, ssh, several chat apps (including video and audio chat) sans being plugged in.

This is the laptop we’ve all wanted.

I half wish (or is it “wish”) I’d gone with the 5G-enabled, M1-powered iPad Pro instead … but this laptop is phenomenal.

There has been nothing I’ve wanted to do in the last 3 months that I could not do with this machine.

Kudos, Apple.

Kudos.

sometimes i’m off

It took Apple 5.5 (or 6, if you count last week as really hitting it) years to introduce what I called the MacBook Flex back in 2015.

With the 13″ MacBook Pro available in an M1-powered edition (which is so much better than the top-end MBP from 2019…it’s not even funny), and now a 5G-enabled iPad Pro running on the M1 … it’s here.

remembering sqrt

A couple of weeks ago, some folks in the splunk-usergroups.slack helped me use accum and a modulus calculation to make a grid menu from a list.

My original search had been along the lines of:

| inputlookup mylookup
| stats count by type
| fields - count
| transpose
| fields - column

Which was great … until my list grew to more than about 12 entries (and scrolling became a pain).

A couple of folks there helped me flip it to this format:

| inputlookup mylookup
| stats count by type
| eval num=1
| accum num
| eval num=num-1
| eval mod=floor(num/12)
| eval type.{mod}=type
| fields - mod num type count
| stats list(*) as *

Which works awesomely.

Unless the modulus value (currently 12) gets too small: if the total list grows to more than modval^2 entries, each individual box is no longer in alphabetical order (and then alphabetical from box to box).

So I made this modification so that, regardless of the size of the list, the grid will automanage itself (using ceil(sqrt(_tot)) as the modulus keeps the grid roughly square; 40 entries, for example, yields a modval of 7):

| inputlookup mylookup
| stats count by type
| eventstats count as _tot
| eval modval=ceil(sqrt(_tot))
| eval num=1
| accum num
| eval num=num-1
| eval mod=floor(num/modval)
| eval type.{mod}=type
| fields - modval mod num type count
| stats list(*) as *

Dunno if that’ll help anyone else, but wanted to share back that self-managing aspect I added, in case anyone was interested 🙂

4 places to test your internet connectivity

a poor user’s guide to accelerating data models in splunk

Data Models are one of the major underpinnings of Splunk’s power and flexibility.

They’re the only way to benefit from the powerful pivot command, for example.

They underlie Splunk Enterprise Security (probably the biggest “non-core” use of Splunk amongst all their customers).

Key to achieving peak performance from Splunk Data Models, though, is that they be “accelerated”.

Unfortunately (or, fortunately, if you’re administering the environment, and your users are mostly casually-experienced with Splunk), the ability to accelerate a Data Model is controlled by the extensive RBACs available in Splunk.

So what is a poor user to do if they want their Data Model to be faster (or even “complete”) when using it to power pivot tables, visualizations, etc?

This is something I’ve run into with customers who don’t want to give me higher-level permissions in their environment.

And it’s something you’re likely to run into – if you’re not a “privileged user”.

Let’s say you have a Data Model that’s looking at firewall logs (Cisco IOS syslog). Say you want to look at these logs going back days or weeks, and display the results in a pivot table.

If you’re in an environment like the one I was working in recently, looking at even 100 hours (slightly over 4 days) worth of these events can take 6, 8, or even 10 minutes to plow through before your pivot can start working (and, therefore, before the dashboard you’re trying to review is fully loaded).

Oh!

One more thing.

That search that’s powering your Data Model? Sometimes (for unknown reasons I don’t have the time to fully ferret out), it will fail to return “complete” results (versus running the same search in Search).

So what is a poor user to do?

Here’s what I’ve done a few times.

I schedule the search to run on a regular interval (maybe every 4 or 12 hours) via a scheduled Report.

And I have the search do an outputlookup to a CSV file.
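
As a rough sketch (the index, sourcetype, fields, and lookup filename here are all made up; substitute whatever your Data Model’s “raw search” actually does), the scheduled Report might look something like:

index=network sourcetype=cisco:ios earliest=-12h
| stats count by host, src_ip, dest_ip, action
| outputlookup firewall_summary.csv

By default, outputlookup overwrites the file each run, so the lookup always holds the results of the most recent scheduled run.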

Then in my Data Model, instead of running the “raw search”, I’ll do the following:

| inputlookup <name-of-generated-csv>

That’s it.

That’s my secret.

When your permissions won’t let you do “what you want” … pretend you’re life in Ian Malcolm‘s mind – find a way!