# pysitemap

> Sitemap generator

## Installation

```
pip install sitemap-generator
```

## Requirements

```
asyncio
aiofile
aiohttp
```
## Example 1

```
import sys
import logging
from pysitemap import crawler

if __name__ == '__main__':
    if '--iocp' in sys.argv:
        from asyncio import events, windows_events
        sys.argv.remove('--iocp')
        logging.info('using iocp')
        el = windows_events.ProactorEventLoop()
        events.set_event_loop(el)

    # root_url = sys.argv[1]
    root_url = 'https://www.haikson.com'
    crawler(root_url, out_file='sitemap.xml')
```
## Example 2

```
import sys
import logging
from pysitemap import crawler

if __name__ == '__main__':
    root_url = 'https://mytestsite.com/'
    crawler(
        root_url,
        out_file='sitemap.xml',
        maxtasks=100,
        verifyssl=False,
        exclude_urls=[
            r'/git/.*(action|commit|stars|activity|followers|following|\?sort|issues|pulls|milestones|archive|/labels$|/wiki$|/releases$|/forks$|/watchers$)',
            r'/git/user/(sign_up|login|forgot_password)',
            '/css',
            '/js',
            'favicon',
            r'[a-zA-Z0-9]*\.[a-zA-Z0-9]*$',
            r'\?\.php',
        ],
        exclude_imgs=[
            r'logo\.(png|jpg)',
            'avatars',
            'avatar_default',
            '/symbols/',
        ],
        image_root_urls=[
            'https://mytestsite.com/photos/',
            'https://mytestsite.com/git/',
        ],
        headers={'User-Agent': 'Crawler'},
        # TZ offset in hours
        timezone_offset=3,
        changefreq={
            "/git/": "weekly",
            "/": "monthly",
        },
        priorities={
            "/git/": 0.7,
            "/metasub/": 0.6,
            "/": 0.5,
        },
    )
```
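The `exclude_urls` and `exclude_imgs` entries are regular expressions. As a rough illustration of how such filtering presumably behaves (the helper below is a sketch, not pysitemap's actual implementation):

```python
import re

def is_excluded(url, patterns):
    # A URL is skipped if any exclusion pattern matches anywhere in it.
    return any(re.search(p, url) for p in patterns)

exclude_urls = [
    r'/git/user/(sign_up|login|forgot_password)',
    r'\?\.php',
    'favicon',
]

print(is_excluded('https://mytestsite.com/git/user/login', exclude_urls))  # matches
print(is_excluded('https://mytestsite.com/blog/post-1', exclude_urls))     # no match
```

Because the patterns are searched, not anchored, a substring match anywhere in the URL is enough to exclude it.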
### TODO

- Big sites with more than 100K pages will use more than 100 MB of memory.
  Move the queue and done lists into a database; write Queue and Done
  backend classes based on:
  - Lists
  - SQLite database
  - Redis
- Write an API so users can plug in their own backends
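The backend split described above could look roughly like this; the class and method names below are hypothetical, not pysitemap's actual API:

```python
from abc import ABC, abstractmethod

class TodoBackend(ABC):
    # Hypothetical interface: the crawler only needs put/get/empty
    # semantics, so the storage (list, SQLite, Redis) can be swapped
    # behind it without touching the crawl loop.
    @abstractmethod
    def put(self, url): ...
    @abstractmethod
    def get(self): ...
    @abstractmethod
    def empty(self): ...

class ListTodoBackend(TodoBackend):
    # In-memory variant: fast, but holds every URL in RAM.
    def __init__(self):
        self._queue = []
    def put(self, url):
        self._queue.append(url)
    def get(self):
        return self._queue.pop(0)
    def empty(self):
        return not self._queue

backend = ListTodoBackend()
backend.put('https://example.com/')
print(backend.get())
```

A SQLite or Redis variant would implement the same three methods against its own store, trading speed for bounded memory.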
## Changelog

**v. 0.9.3**

Added features:
- Option to enable/disable website SSL certificate verification (`True`/`False`)
- Option to exclude URL patterns (`list`)
- Option to provide custom HTTP request headers to the web server (`dict`)
- Support for `<lastmod>` tags (XML)
  - Configurable timezone offset for the lastmod tag
- Support for `<changefreq>` tags (XML)
  - Input (`dict`): `{url_regex: changefreq_value, ...}`
- Support for `<priority>` tags (XML)
  - Input (`dict`): `{url_regex: priority_value, ...}`
- Reduced the default maximum number of concurrent tasks from `100` to `10`
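For the `<lastmod>` and timezone-offset features above, a sketch of how an offset in hours can be turned into an ISO 8601 timestamp of the kind sitemaps accept (the exact formatting of pysitemap's output is an assumption here):

```python
from datetime import datetime, timedelta, timezone

def lastmod(dt, timezone_offset=0):
    # Render a datetime in ISO 8601 form, shifted into a fixed
    # UTC offset given in hours (e.g. timezone_offset=3 -> +03:00).
    tz = timezone(timedelta(hours=timezone_offset))
    return dt.astimezone(tz).isoformat()

dt = datetime(2023, 5, 1, 12, 0, 0, tzinfo=timezone.utc)
print(lastmod(dt, timezone_offset=3))  # 2023-05-01T15:00:00+03:00
```

The same regex-keyed `dict` lookup used for `changefreq` and `priorities` can then attach the right values per URL when the XML is written.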
**v. 0.9.2**
- Todo queue and done list backends
- Created a (very slow) SQLite backend for the todo queue and done lists (writing 1000 URLs takes about 3 minutes)
- Tests for the sqlite_todo backend

**v. 0.9.1**
- Extended readme
- Docstrings and code comments

**v. 0.9.0**
- From this version on, the package supports only Python `>=3.7`
- All functions were rewritten, but the API is unchanged: if you use this package, just update it, install the requirements, and run as before
- All requests run asynchronously