# Hexo Blog Optimization

## Local Optimization

### Unique Post Links

Hexo's default permalink format is:

```
:year/:month/:day/:title/
```

Drawbacks: if a post title is in Chinese, shared links turn into long, ugly runs of `%XX` escapes. Worse, the link changes whenever you change the title or the date, breaking anything that pointed at it.
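To see the `%XX` problem concretely, apply the percent-encoding a browser uses for non-ASCII URL paths (the two-character title below is just an example):

```javascript
// A Chinese title, percent-encoded the way it appears in a shared URL.
// '博客' ("blog") is an arbitrary example title.
const encoded = encodeURIComponent('博客');
console.log(encoded); // %E5%8D%9A%E5%AE%A2
```

Each character expands to three `%XX` escapes, so even a short title produces a long, unreadable link.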
Solution: use the hexo-abbrlink plugin.

- Install the plugin in the blog root (the directory where you run hexo commands):

  ```shell
  npm install hexo-abbrlink --save
  ```

- Edit the site config file `_config.yml`: first comment out the original `permalink` setting, then add the configuration below.
```yaml
# ======================================================= #
# =====================abbrlink========================== #
# ======================================================= #
#permalink: :year/:month/:day/:title/
# permalink_defaults:
# pretty_urls:
#   trailing_index: false # Set to false to remove trailing 'index.html' from permalinks
#   trailing_html: false # Set to false to remove trailing '.html' from permalinks
permalink: posts/:abbrlink/
abbrlink:
  alg: crc32    # support crc16(default) and crc32
  rep: hex      # support dec(default) and hex
  drafts: false # (true) process drafts, (false) do not process drafts. false(default)
  # Generate categories from directory-tree
  # depth: the max_depth of the directory-tree you want to generate, should be > 0
  auto_category:
    enable: true # true(default)
    depth:       # 3(default)
    over_write: false
  auto_title: false # enable auto title; it can auto-fill the title from the path
  auto_date: false  # enable auto date; it can auto-fill the date with today's date
  force: false # enable force mode; in this mode the plugin ignores the cache and recalculates the abbrlink for every post, even if it already has one
```
Project link: hexo-abbrlink

- Regenerate:

  ```shell
  hexo clean && hexo g
  ```

  After regeneration, an `abbrlink` field is added to each post's `front-matter`.

- Verify: preview locally with

  ```shell
  hexo g && hexo s
  ```
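After regeneration, a post's front-matter looks roughly like this (the title, date, and abbrlink value below are illustrative, not taken from a real post):

```yaml
---
title: my-post            # illustrative
date: 2021-08-01 12:00:00 # illustrative
abbrlink: 8ddf18fb        # added automatically by hexo-abbrlink (crc32 + hex)
---
```

The post's URL then becomes `posts/8ddf18fb/`, which stays stable even if the title or date changes.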
### Compressing Assets with Gulp

#### Install gulp and related plugins

You can edit `package.json` directly: add `gulp-replace` (with a version number) along with the gulp dependencies. After editing, run the following in the blog root to reinstall:

```shell
npm i
```

Here is my `package.json` for reference:
```json
{
  "name": "hexo-site",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "rebuild": "hexo clean && hexo g && gulp",
    "rebuildJsd": "hexo clean && hexo generate --config _github.yml && gulp two",
    "deployGit": "hexo deploy --config _github.yml",
    "deployGitHub": "hexo deploy --config _githubOnly.yml",
    "deployOss": "hexo deploy --config _tencent.yml",
    "start": "hexo clean && hexo g && gulp && hexo d",
    "build": "hexo generate",
    "clean": "hexo clean",
    "server": "hexo server"
  },
  "hexo": {
    "version": "5.4.0"
  },
  "dependencies": {
    "aplayer": "^1.10.1",
    "gulp-replace": "^1.1.3",
    "hexo": "^5.0.0",
    "hexo-abbrlink": "^2.2.1",
    "hexo-autonofollow": "^1.0.1",
    "hexo-deployer-git": "^3.0.0",
    "hexo-generator-archive": "^1.0.0",
    "hexo-generator-baidu-sitemap": "^0.1.9",
    "hexo-generator-category": "^1.0.0",
    "hexo-generator-index": "^2.0.0",
    "hexo-generator-json-content": "^4.2.3",
    "hexo-generator-search": "^2.4.3",
    "hexo-generator-sitemap": "^2.1.0",
    "hexo-generator-tag": "^1.0.0",
    "hexo-renderer-ejs": "^1.0.0",
    "hexo-renderer-markdown-it": "^5.0.0",
    "hexo-renderer-stylus": "^2.0.0",
    "hexo-server": "^2.0.0",
    "hexo-theme-stellar": "^1.4.1",
    "hexo-wordcount": "^6.0.1",
    "markdown-it-emoji": "^2.0.0"
  },
  "devDependencies": {
    "gulp": "^4.0.2",
    "gulp-html-minifier-terser": "^6.0.1",
    "gulp-htmlclean": "^2.7.22",
    "gulp-htmlmin": "^5.0.1",
    "gulp-minify-css": "^1.2.4",
    "gulp-terser": "^2.0.1"
  }
}
```
#### Edit gulpfile.js

Create a `gulpfile.js` file in the blog root and add the code below.

- The `minify_html_jsd` task is the CDN-redirect variant; point the links at your own static-file repository.
- Each `pipe(replace(search, replacement))` call rewrites links; customize these to your needs.
```javascript
var gulp = require('gulp');
var minifycss = require('gulp-minify-css');
var htmlmin = require('gulp-html-minifier-terser');
// var uglify = require('gulp-uglify');
var htmlclean = require('gulp-htmlclean');
var terser = require('gulp-terser');
var replace = require('gulp-replace');

// Minify CSS files
const minify_css = () => (
  gulp.src(['./public/**/*.css'])
    .pipe(minifycss({
      compatibility: 'ie8'
    }))
    // .pipe(minifycss())
    .pipe(gulp.dest('./public'))
);

// Minify HTML files, rewriting jsDelivr CDN links to my own domain
const minify_html = () => (
  gulp.src(['./public/**/*.html', '!./public/{lib,lib/**}', '!./public/**.xml'])
    .pipe(replace('src="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/js/', 'src="https://sizaif.com/js/'))
    .pipe(replace('href="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/css/', 'href="https://sizaif.com/css/'))
    .pipe(replace('https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/img/img_article', 'https://sizaif.com/img/img_article/'))
    .pipe(htmlclean())
    .pipe(htmlmin({
      removeComments: true,
      minifyJS: true,
      minifyCSS: true,
      minifyURLs: true,
    }))
    .pipe(gulp.dest('./public'))
);

// Minify HTML files, CDN variant: the jsDelivr links are kept as-is
// (the replace calls below are intentionally no-ops; swap in your own
// rewrites if your static-file repository lives elsewhere)
const minify_html_jsd = () => (
  gulp.src(['./public/**/*.html', '!./public/{lib,lib/**}'])
    .pipe(replace('src="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/js/', 'src="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/js/'))
    .pipe(replace('href="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/css/', 'href="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/css/'))
    .pipe(replace('https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/img/img_article', 'https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/img/img_article'))
    .pipe(htmlclean())
    .pipe(htmlmin({
      removeComments: true,
      minifyJS: true,
      minifyCSS: true,
      minifyURLs: true,
    }))
    .pipe(gulp.dest('./public'))
);

// Minify JS files (skip already-minified *.min.js and the lib directory)
const minify_js = () => (
  gulp.src(['./public/**/*.js', '!./public/**/*.min.js', '!./public/{lib,lib/**}'])
    .pipe(terser())
    // .pipe(uglify())
    .pipe(gulp.dest('./public'))
);

module.exports = {
  minify_html: minify_html,
  minify_css: minify_css,
  minify_js: minify_js,
  minify_html_jsd: minify_html_jsd
};

// 'one': build without CDN; 'two': build keeping CDN links
gulp.task('one', gulp.parallel(
  minify_html,
  minify_css,
  minify_js
));

gulp.task('two', gulp.parallel(
  minify_html_jsd,
  minify_css,
  minify_js
));

gulp.task('default', gulp.series('one'));
```
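The core of the rewrite is plain string substitution. This standalone sketch shows what one `pipe(replace(search, replacement))` step does to the HTML passing through it (the file name `app.js` is hypothetical):

```javascript
// What one replace step does to an HTML string: every occurrence of the
// jsDelivr prefix is swapped for the custom domain.
const html = '<script src="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/js/app.js"></script>';
const rewritten = html.replace(
  'src="https://cdn.jsdelivr.net/gh/sizaif/blog-cdn@main/js/',
  'src="https://sizaif.com/js/'
);
console.log(rewritten); // <script src="https://sizaif.com/js/app.js"></script>
```

gulp-replace applies the same substitution to the contents of every matched file in the stream.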
#### Run

Once gulp is in the pipeline, one extra command is needed before publishing (`hexo d`):

```shell
# without CDN
$ hexo clean && hexo g && gulp && hexo d
# with CDN redirection
$ hexo clean && hexo g && gulp two && hexo d
```
### SEO: Sitemaps

#### Install the plugins

Baidu:

```shell
$ npm install hexo-generator-baidu-sitemap --save
```

Google:

```shell
$ npm install hexo-generator-sitemap --save
```

#### Edit the main _config.yml

Add the following:

```yaml
## ======================================================= #
## == SEO: sitemaps                                     == #
##    npm install hexo-generator-sitemap --save            #
##    npm install hexo-generator-baidu-sitemap --save      #
## ======================================================= #
sitemap:
  path: sitemap.xml
  # tag: true
  # category: true
baidusitemap:
  path: baidusitemap.xml
```
#### Regenerate

```shell
$ hexo clean && hexo g
```

This generates two files in the `public` directory:

- `sitemap.xml` -> Google
- `baidusitemap.xml` -> Baidu

Then submit each file to the corresponding search engine's webmaster console.
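For reference, the generated `sitemap.xml` follows the standard sitemap protocol; a minimal example of its shape (the URL and date below are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://sizaif.com/posts/8ddf18fb/</loc>
    <lastmod>2021-08-01</lastmod>
  </url>
</urlset>
```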
### The robots.txt File

robots.txt is a plain-text file in which a site owner declares which parts of the site should not be visited by robots, or which content search engines may index.

When a search robot (also called a spider) visits a site, it first checks whether robots.txt exists in the site root. If it does, the robot limits its crawl to the scope the file defines; if not, it simply follows links.

Note that robots.txt must be placed in the site root, and the filename must be all lowercase.

Create `robots.txt` under the `source` directory, replacing the domain with your own:
```
# robots.txt www.sizaif.com
User-agent: *
Allow: /sitemap.xml
Allow: /baidusitemap.xml
Allow: /category-sitemap.xml
Allow: /page-sitemap.xml
Allow: /post-sitemap.xml
Allow: /tag-sitemap.xml
Allow: /posts/
Disallow: /css
Disallow: /js
Disallow: /fonts
Disallow: /img
Disallow: /page
Disallow: /navigation
Disallow: /tags
Disallow: /archives
Disallow: /404.html
Disallow: /403.html
Disallow: /update.log
Disallow: /search.xml
Disallow: /README.md
```
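A quick way to sanity-check the rules: robots matching is essentially prefix matching on the URL path. This simplified sketch (it ignores Allow precedence, wildcards, and the rest of the robots.txt spec) checks a path against the Disallow prefixes above:

```javascript
// Simplified robots.txt check: a path is blocked if it starts with any
// Disallow prefix. Real crawlers implement the full spec; this is only
// an illustration of the rules listed above.
const disallowed = ['/css', '/js', '/fonts', '/img', '/page',
                    '/navigation', '/tags', '/archives'];
const isBlocked = (path) => disallowed.some((prefix) => path.startsWith(prefix));

console.log(isBlocked('/css/style.css'));   // true  (blocked)
console.log(isBlocked('/posts/8ddf18fb/')); // false (crawlable)
```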