Shell Script: Nginx Access Log Analysis

1. How it works
You can view the Nginx access log through the /usr/local/nginx/logs/access.log file:
# tail -f /usr/local/nginx/logs/access.log
192.168.70.1 - "GET / HTTP/1.1" 200 173833 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.82" "-"
192.168.70.1 - "GET / HTTP/1.1" 200 173833 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.82" "-"

The log output format is defined in the /usr/local/nginx/conf/nginx.conf file.
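For reference, the stock nginx.conf defines the access log format with a log_format directive along these lines (this is the widely used default "main" format; the format in your configuration may differ):

http {
    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    access_log  logs/access.log  main;
}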
You can use the awk command to extract just the fields you need:
# awk '{print $0}' /usr/local/nginx/logs/access.log
192.168.70.1 - "GET / HTTP/1.1" 200 173833 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.82" "-"
192.168.70.1 - "GET / HTTP/1.1" 200 173833 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.82" "-"
# awk '{print $1}' /usr/local/nginx/logs/access.log
192.168.70.1
192.168.70.1
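The script below builds on awk associative arrays: each field value is used as a key whose counter is incremented, and the totals are printed in the END block. A minimal standalone sketch of that pattern (counting requests per client IP, which is what the first block of the script does):

# awk '{count[$1]++} END {for (ip in count) print count[ip], ip}' /usr/local/nginx/logs/access.log | sort -nr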
2. The shell script

Nginx access log analysis script:
#!/bin/bash
# 1. IPs with the most requests
# 2. IPs with the most requests within a time range
# 3. Pages visited more than 2 times
# 4. Status code counts per page
LOG_FILE=$1
echo "统计访问最多的10个IP"
awk '{a[$1]++}END{print "UV:",length(a);for(v in a)print a v}' $LOG_FILE |sort -k1 -nr | head -10
echo "----------------------------------------"
echo "统计一个时间段访问最多的10个IP"
awk '$3>="++}END{print "UV:",length(a);for(v in a)print a v}' $LOG_FILE |sort -k1 -nr | head -10
echo "----------------------------------------"
echo "统计访问量超过2次的页面"
awk '{a[$7]++}END{print "PV:",length(a);for(v in a){if(a>2)print a,v}}' $LOG_FILE | sort -k1 -nr
echo "----------------------------------------"
echo "统计访问页面状态码数量"
awk '{a[$7" "$8]++}END{for(v in a)print a,v}' $LOG_FILE | sort -k1 -nr# bash 13.sh /usr/local/nginx/logs/access.log
Top 10 IPs by request count
2 192.168.70.1
UV: 1
----------------------------------------
Top 10 IPs by request count within a time range
1 192.168.70.1
UV: 1
----------------------------------------
Pages visited more than 2 times
PV: 1
----------------------------------------
Status code counts per page
2 HTTP/1.1" 200
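The script takes the log path as its only argument and does not validate it. A small guard at the top (a sketch, not part of the original script) would make it fail cleanly when the argument is missing or the file is unreadable:

#!/bin/bash
LOG_FILE=$1
# Bail out early if no log path was given or the file cannot be read
if [ -z "$LOG_FILE" ] || [ ! -r "$LOG_FILE" ]; then
    echo "Usage: $0 /path/to/access.log" >&2
    exit 1
fi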
Source: https://www.cnblogs.com/xuxuxuxuxu/p/17561246.html