
Setting up a test Grafana

Jerry Lundström edited this page Mar 4, 2024 · 48 revisions

Nov 2022: Update DONE

  • all instructions on this page are DONE
  • test site example dashboards exported from Grafana v9.2.3
  • Nov 9, fixed dsc-datatool example run - influx data was written to a file rather than stdout

Please read this first!

This is meant as a learning example

The setup and graphs described in this Wiki are meant as an example of what can be done. It's important to understand and learn how DSC's datasets work, how they are exported to InfluxDB and how graphs in Grafana work.

Grafana is an amazing tool, and you will greatly benefit from customizing your own graphs for your own purposes and needs.

InfluxDB's Retention/Downsampling

This subject will not be covered here; please refer to the InfluxDB documentation to set it up the way you want.


Installation

For these examples Debian 11.5, InfluxDB 2.5.0, Grafana 9.2.3 and dsc-datatool 1.2.0 were used.

See each software's documentation for other distributions:

When you have a Debian system up and running, do the following to set up all the repositories needed:

sudo apt install wget gnupg apt-transport-https

wget -q https://repos.influxdata.com/influxdata-archive_compat.key
echo '393e8779c89ac8d958f81f942f9ad7fb82a25e133faddaf92e15b16e6ac9ce4c influxdata-archive_compat.key' | sha256sum -c && cat influxdata-archive_compat.key | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/influxdata-archive_compat.gpg > /dev/null
echo 'deb [signed-by=/etc/apt/trusted.gpg.d/influxdata-archive_compat.gpg] https://repos.influxdata.com/debian stable main' | sudo tee /etc/apt/sources.list.d/influxdata.list

sudo wget -q -O /usr/share/keyrings/grafana.key https://packages.grafana.com/gpg.key
echo "deb [signed-by=/usr/share/keyrings/grafana.key] https://packages.grafana.com/oss/deb stable main" | sudo tee -a /etc/apt/sources.list.d/grafana.list

sudo mkdir -p /etc/apt/keyrings
wget -O - https://pkg.dns-oarc.net/dns-oarc.distribution.key.gpg | sudo tee /etc/apt/keyrings/pkg.dns-oarc.net.asc
echo "deb [signed-by=/etc/apt/keyrings/pkg.dns-oarc.net.asc] http://pkg.dns-oarc.net/stable/`lsb_release -c -s` `lsb_release -c -s` main" | sudo tee /etc/apt/sources.list.d/dns-oarc.list

Now we can install all software needed:

sudo apt update
sudo apt install influxdb2 grafana dsc-datatool

For some reason InfluxDB does not automatically start on installation and Grafana is disabled, so we fix that to make sure they start automatically on the next boot:

sudo systemctl daemon-reload
sudo systemctl enable grafana-server
sudo systemctl start influxdb grafana-server
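
To confirm that both services came up, and that Grafana is enabled for the next boot, you can query systemd (a quick sanity-check sketch using the service names from above):

```shell
# Both services should report "active"; grafana-server should report "enabled".
systemctl is-active influxdb grafana-server
systemctl is-enabled grafana-server
```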

Grafana setup

Once Grafana has started up, it's a good idea to log in and change the default admin/admin password. By default, Grafana listens on http://$IP:3000/.

Proxy setup (optional)

If you want to run Grafana behind a proxy you need to configure the public URL:

sudo vi /etc/grafana/grafana.ini 

Find and change the following parameters to match your environment:

root_url = %(protocol)s://%(domain)s:%(http_port)s/$subpath/
serve_from_sub_path = true

And restart:

sudo systemctl restart grafana-server.service
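
As a minimal sketch of the proxy side, assuming nginx and a /grafana/ sub-path (names, paths and ports here are assumptions, adjust to your environment), the corresponding location block could look like:

```nginx
# Hypothetical nginx reverse-proxy fragment for Grafana under /grafana/.
# With serve_from_sub_path = true, Grafana expects to receive the full
# /grafana/... path, so proxy_pass carries no trailing slash or path.
location /grafana/ {
    proxy_set_header Host $host;
    proxy_pass http://localhost:3000;
}
```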

InfluxDB setup

We now need to do the initial setup of InfluxDB, then create a mapping for InfluxQL along with a read-only token for Grafana.

First we do the setup, remember to change the $password:

influx setup -u dsc -p $password -o dsc -b dsc -f

Now we create the mapping:

influx v1 dbrp create --db dsc --rp autogen --bucket-id `influx bucket list -n dsc --hide-headers|cut -f 1`

And then the read-only token; note it down for use in the next section:

influx auth create --org dsc --read-bucket `influx bucket list -n dsc --hide-headers|cut -f 1`
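
You can sanity-check both steps afterwards with the same influx CLI (a sketch; output formats may vary between CLI versions):

```shell
# List the DBRP mapping created for the InfluxQL compatibility layer;
# it should show database "dsc" mapped to the dsc bucket.
influx v1 dbrp list

# List authorizations; the new token should show read access to the bucket.
influx auth list
```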

Grafana Data Source

Now let's create an InfluxDB data source: in Grafana, as admin, select Configuration, Data sources, "Add data source" and then InfluxDB.

In the HTTP settings you need to specify the URL to InfluxDB (yes, even if the gray placeholder text shows the same value):

  • URL: http://localhost:8086/

Then use "Add header" under Custom HTTP Headers, replacing $token with the token from the previous step - make sure the value includes the word Token followed by a space.

  • Header: Authorization
  • Value: Token $token

Next we configure the InfluxDB Details:

  • Database: dsc
  • HTTP Method: POST
  • Min time interval: 1m

This minimum time interval setting needs to match the interval at which DSC writes its data.

Use Save & test to check that it is working.

!!NOTE!! Grafana checks that the connection works by fetching a list of measurements, but since there is likely nothing in the database yet it will always show error connecting influxDB influxQL. You can verify the connection by looking in the syslog, as InfluxDB logs commands by default - if you see SHOW MEASUREMENTS ON dsc with no apparent errors after it, all is working.
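
You can also test the InfluxQL endpoint directly from the command line, outside Grafana (a sketch; replace $token with the read-only token from the previous section):

```shell
# Query the v1-compatibility endpoint the same way Grafana does.
# An empty result set is fine at this point; an authorization error is not.
curl -G http://localhost:8086/query \
  -H "Authorization: Token $token" \
  --data-urlencode "db=dsc" \
  --data-urlencode "q=SHOW MEASUREMENTS"
```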

Example Grafana Dashboards

There are a few example Grafana dashboards that you can import:

  • Download the dashboard JSON files here
  • Under menu Dashboards, select + Import
  • Use Upload JSON file to add the JSON file(s) and import it
  • Repeat from step 2 until all JSON files have been uploaded

DSC XML data

The instructions below require access to the XML data from DSC; it is up to you how to make that available.

Fetch Labler YAML (optional)

The IANA DNS parameters can be used with the Labler transformer to rewrite RCODE, QTYPE and OPCODE numbers into their names.

For this to work, the Labler needs a YAML file with the label names; you can run the following contributed script to download them from IANA:

wget https://github.com/DNS-OARC/dsc-datatool/raw/develop/contrib/iana-dns-params-toyaml.py
python3 iana-dns-params-toyaml.py > $HOME/labler.yaml
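
After running the script, a quick sanity check (a sketch, nothing authoritative) is to confirm the file is non-empty and actually contains YAML rather than, say, a saved error page:

```shell
# The file should exist, be non-empty, and start with YAML content.
test -s "$HOME/labler.yaml" && head -n 5 "$HOME/labler.yaml"
```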

NOTE: You should set up a routine for updating this YAML file from time to time; it does happen (not often) that new QTYPEs/RCODEs get added.

MaxMind Database Setup

To use the client_subnet_country generator you will need the databases from MaxMind. These databases are no longer openly distributed because of privacy regulations, but they are still available for free, see https://dev.maxmind.com/geoip/geoip2/geolite2/.

Once you have your access and geoip.cfg, use geoipupdate to fetch the databases; see https://github.com/maxmind/geoipupdate for its installation instructions.

sudo geoipupdate -f geoip.cfg
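
A minimal geoip.cfg could look like the following (illustrative values only; AccountID and LicenseKey come from your MaxMind account, and the edition IDs you need may differ):

```
# geoip.cfg for geoipupdate (standard GeoIP.conf format)
AccountID 999999
LicenseKey 000000000000
EditionIDs GeoLite2-ASN GeoLite2-City GeoLite2-Country
```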

Import!

We can now run dsc-datatool and import some data!

The following environment variables are used in the shell code below:

  • $SERVER: Used to set the server name (--server)
  • $NODE: Used to set the node name (--node)
  • $XML: Used as the input XML file/directory (--xml)

dsc-datatool \
  --server "$SERVER" \
  --node "$NODE" \
  --output ";InfluxDB;dml=1;database=dsc" \
  --transform ";Labler;*;yaml=$HOME/labler.yaml" \
  --transform ";ReRanger;rcode_vs_replylen;range=/64;pad_to=5" \
  --transform ";ReRanger;qtype_vs_qnamelen;range=/16;pad_to=3" \
  --transform ";ReRanger;client_port_range;key=low;range=/2048;pad_to=5" \
  --transform ";ReRanger;edns_bufsiz,priming_queries;key=low;range=/512;pad_to=5;allow_invalid_keys=1" \
  --transform ";ReRanger;priming_responses;key=low;range=/128;pad_to=4" \
  --transform ";NetRemap;client_subnet,client_subnet2,client_addr_vs_rcode,ipv6_rsn_abusers;net=8" \
  --generator client_subnet_country \
  --generator ";client_subnet_authority;fetch=yes" \
  --xml "$XML" | influx write -b dsc -o dsc --format=lp

!!NOTE!! Using fetch=yes for client_subnet_authority executes a couple of HTTP GETs on every run! If you run this frequently, it's recommended that you download the files to local storage regularly and point to them instead. See man dsc-datatool-generator client_subnet_authority.
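
To run the import regularly, the command above can be wrapped in a small script and called from cron. This is only a sketch: the server/node names and the XML directory are assumptions, adjust them to your setup:

```shell
#!/bin/sh
# Hypothetical cron wrapper around the import command shown above.
set -e

SERVER="my-server"          # value for --server (assumption)
NODE="my-node"              # value for --node (assumption)
XML="/var/lib/dsc/$NODE"    # directory where DSC writes its XML (assumption)

# Same pipeline as the full example above; add the remaining
# --transform/--generator options from that example as needed.
dsc-datatool \
  --server "$SERVER" \
  --node "$NODE" \
  --output ";InfluxDB;dml=1;database=dsc" \
  --transform ";Labler;*;yaml=$HOME/labler.yaml" \
  --xml "$XML" | influx write -b dsc -o dsc --format=lp
```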

View the Dashboard

You should now have data in your dashboard at http://$IP:3000/.

Questions?

If you have any questions about any of this, you can find me on OARC's Mattermost in the OARC Software channel. Just @jelu <question> me :)