Commit Graph

390 Commits (master)

Author SHA1 Message Date
emijrp 2e99d869e2 fixing Wikia images bug, issue #212 8 years ago
emijrp 3f697dbb5b restoring dumpgenerator.py code to f43b7389a0 last stable version. I will rewrite code in wikiteam/ subdirectory 8 years ago
emijrp 4e939f4e98 getting ready to other wiki engines, functions prefixes: MediaWiki (mw), Wikispaces (ws) 8 years ago
nemobis f43b7389a0 Merge pull request #270 from nemobis/master
When we have a working index.php, do not require api.php
8 years ago
Federico Leva 480d421d7b When we have a working index.php, do not require api.php
We work well without api.php. This was a needless suicide.
Especially as sometimes sysadmins like to disable the API for no
reason and then index.php is our only option to archive the wiki.
8 years ago
emijrp 15223eb75b Parsing more image names from HTML Special:Allimages 8 years ago
emijrp e138d6ce52 New API params to continue in Allimages 8 years ago
emijrp 2c0f54d73b new HTML regexp for Special:Allpages 8 years ago
emijrp 4ef665b53c In recent MediaWiki versions, API continue is a bit different 8 years ago
Daniel Oaks 376e8a11a3 Avoid out-of-memory error in two extra places 9 years ago
Tim Sheerman-Chase 877b736cd2 Merge branch 'retry' of https://github.com/TimSC/wikiteam into retry 9 years ago
Tim Sheerman-Chase 6716ceab32 Fix tests 9 years ago
Tim Sheerman-Chase 5cb2ecb6b5 Attempting to fix missing config in tests 9 years ago
Tim 93bc29f2d7 Fix syntax errors 9 years ago
Tim d5a1ed2d5a Fix indentation, use classic string formatting 9 years ago
Tim Sheerman-Chase 8380af5f24 Improve retry logic 9 years ago
PiRSquared17 fadd7134f7 What I meant to do, ugh 9 years ago
PiRSquared17 1b2e83aa8c Fix minor error with normpath call 9 years ago
PiRSquared17 5db9a1c7f3 Normalize path/foo/ to path/foo, so -2, etc. work (fixes #244) 9 years ago
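The path fix in 5db9a1c7f3 relies on standard-library behavior — a minimal illustration:

```python
import os.path

# Trailing slashes are dropped, so a dump directory passed as
# "path/foo/" matches the "path/foo" recorded earlier, and resume
# flags like -2 can find it.
print(os.path.normpath("path/foo/"))
```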
Federico Leva 2b78bfb795 Merge branch '2015/iterators' of git://github.com/nemobis/wikiteam into nemobis-2015/iterators
Conflicts:
	requirements.txt
9 years ago
Federico Leva d4fd745498 Actually allow resuming huge or broken XML dumps
* Log "XML export on this wiki is broken, quitting." to the error
  file so that grepping reveals which dumps were interrupted that way.
* Automatically reduce export size for a page when downloading the
  entire history at once results in a MemoryError.
* Truncate the file with a pythonic method (.seek and .truncate)
  while reading from the end, by making reverse_readline() a weird
  hybrid to avoid an actual coroutine.
9 years ago
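The truncation idea in d4fd745498 — scan the file backwards and cut everything after the last intact record using `.seek` and `.truncate` — can be sketched as follows. This is an illustrative helper, not the actual `reverse_readline()` hybrid from dumpgenerator.py, and the function name is hypothetical:

```python
import os

def truncate_after_last(path, terminator):
    """Scan a file backwards in blocks and truncate everything after
    the last occurrence of `terminator` (sketch; the accumulated
    buffer can grow to the whole file in the worst case)."""
    block = 4096
    with open(path, "rb+") as f:
        f.seek(0, os.SEEK_END)
        pos = f.tell()
        buf = b""
        while pos > 0:
            step = min(block, pos)
            pos -= step
            f.seek(pos)
            buf = f.read(step) + buf
            idx = buf.rfind(terminator)
            if idx != -1:
                # Keep everything up to and including the terminator.
                f.truncate(pos + idx + len(terminator))
                return True
        return False  # no intact record found
```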
Federico Leva 9168a66a54 logerror() wants unicode, but readTitles etc. give bytes
Fixes #239.
9 years ago
Federico Leva 632b99ea53 Merge branch '2015/iterators' of https://github.com/nemobis/wikiteam into nemobis-2015/iterators 9 years ago
nemobis ff2cdfa1cd Merge pull request #236 from PiRSquared17/fix-server-check-api
Catch KeyError to fix server check
9 years ago
nemobis 0b25951ab1 Merge pull request #224 from nemobis/2015/issue26
Issue #26: Local "Special" namespace, actually limit replies
9 years ago
PiRSquared17 03db166718 Catch KeyError to fix server check 9 years ago
PiRSquared17 f80ad39df0 Make filename truncation work with UTF-8 9 years ago
PiRSquared17 90bfd1400e Merge pull request #229 from PiRSquared17/fix-zwnbsp-bom
Strip ZWNBSP (U+FEFF) Byte-Order Mark from JSON/XML
9 years ago
PiRSquared17 fc276d525f Allow spaces before <mediawiki> tag. 9 years ago
PiRSquared17 1c820dafb7 Strip ZWNBSP (U+FEFF) Byte-Order Mark from JSON/XML 9 years ago
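The fix in 1c820dafb7 boils down to dropping a leading U+FEFF before handing the response to a parser — a one-liner sketch:

```python
def strip_bom(text):
    """Drop a leading U+FEFF (ZWNBSP / byte-order mark) that some
    wikis prepend to their JSON or XML responses, which otherwise
    breaks json/XML parsers."""
    return text.lstrip("\ufeff")
```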
Nemo bis 55e5888a00 Fix UnicodeDecodeError in resume: use kitchen 9 years ago
Federico Leva 14ce5f2c1b Resume and list titles without keeping everything in memory
Approach suggested by @makoshark, finally found the time to start
implementing it.
* Do not produce and save the titles list all at once. Instead, use
  the scraper and API as generators and save titles on the go. Also,
  try to start the generator from the appropriate title.
  For now the title sorting is not implemented. Pages will be in the
  order given by namespace ID, then page name.
* When resuming, read both the title list and the XML file from the
  end rather than the beginning. If the correct terminator is
  present, only one line needs to be read.
* In both cases, use a generator instead of a huge list in memory.
* Also truncate the resumed XML without writing it from scratch.
  For now using GNU ed: very compact, though shelling out is ugly.
  I gave up on using file.seek and file.truncate to avoid reading the
  whole file from the beginning or complicating reverse_readline()
  with more offset calculations.

This should avoid MemoryError in most cases.

Tested by running a dump over a 1.24 wiki with 11 pages: a complete
dump and a resumed dump from a dump interrupted with ctrl-c.
9 years ago
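The "read from the end" approach described in 14ce5f2c1b can be sketched as a reverse line generator: only the tail blocks of the titles list or XML file are read, so resuming never loads the whole dump. This is an illustrative version, not the real `reverse_readline()`:

```python
import os

def reverse_lines(path, block=4096):
    """Yield the lines of a file from last to first without loading
    the whole file into memory (sketch of the reverse-read idea)."""
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        pos = f.tell()
        leftover = b""
        while pos > 0:
            step = min(block, pos)
            pos -= step
            f.seek(pos)
            chunk = f.read(step) + leftover
            lines = chunk.split(b"\n")
            leftover = lines[0]            # may be a partial line
            for line in reversed(lines[1:]):
                yield line.decode("utf-8")
        yield leftover.decode("utf-8")
```

If the correct terminator is the last line, a single iteration of this generator is enough to know the dump is complete.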
Federico Leva 2537e9852e Make dumpgenerator.py 774: required by launcher.py 9 years ago
Federico Leva 79e2c5951f Fix API check if only index is passed
I forgot that the preceding point only extracts the api.php URL if
the "wiki" argument is passed to say it's a MediaWiki wiki (!).
9 years ago
Federico Leva bdc7c9bf06 Issue 26: Local "Special" namespace, actually limit replies
* For some reason, in a previous commit I had noticed that maxretries
  was not respected in getXMLPageCore, but I didn't fix it. Done now.
* If the "Special" namespace alias doesn't work, fetch the local one.
9 years ago
Federico Leva 2f25e6b787 Make checkAPI() more readable and verbose
Also return the api URL we found.
9 years ago
Federico Leva 48ad3775fd Merge branch 'follow-redirects-api' of git://github.com/PiRSquared17/wikiteam into PiRSquared17-follow-redirects-api 9 years ago
nemobis 2284e3d55e Merge pull request #186 from PiRSquared17/update-headers
Preserve default headers, fixing openwrt test
9 years ago
PiRSquared17 5d23cb62f4 Merge pull request #219 from vadp/dir-fnames-unicode
convert images directory content to unicode when resuming download
9 years ago
PiRSquared17 d361477a46 Merge pull request #222 from vadp/img-desc-load-err
dumpgenerator: catch errors for missing image descriptions
9 years ago
Vadim Shlyakhov 4c1d104326 dumpgenerator: catch errors for missing image descriptions 9 years ago
PiRSquared17 b1ce45b170 Try using URL without index.php as index 9 years ago
PiRSquared17 9c3c992319 Follow API redirects 9 years ago
Vadim Shlyakhov f7e83a767a convert images directory content to unicode when resuming download 9 years ago
Benjamin Mako Hill d2adf5ce7c Merge branch 'master' of github.com:WikiTeam/wikiteam 9 years ago
Benjamin Mako Hill f85b4a3082 fixed bug with page missing exception code
My previous code broke the page missing detection code with two negative
outcomes:

- missing pages were not reported in the error log
- every missing page generated an extraneous "</page>" line in output which
  rendered dumps invalid

This patch improves the exception code in general and fixes both of these
issues.
9 years ago
PiRSquared17 9480834a37 Fix infinite images loop
Closes #205 (hopefully)
9 years ago
Benjamin Mako Hill eb8b44aef0 strip <sha1> tags returned under <page>
The Wikia API is exporting sha1 sums as part of the response for pages.
These are invalid XML and are causing dump parsing code (e.g.,
MediaWiki-Utilities) to fail.  Also, sha1 should belong to revisions, not
pages, so it's not entirely clear to me what this is referring to.
9 years ago
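The cleanup in eb8b44aef0 amounts to deleting the offending `<sha1>` elements before the XML reaches a parser. A simplified sketch (the regex strips any `<sha1>` element, which is cruder than the commit's page-level fix):

```python
import re

SHA1_TAG = re.compile(r"\s*<sha1>[^<]*</sha1>")

def strip_page_sha1(xml):
    """Remove <sha1> elements like the ones Wikia emits directly
    under <page>, which are not valid there and break downstream
    dump-parsing code."""
    return SHA1_TAG.sub("", xml)
```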
Benjamin Mako Hill 145b2eaaf4 changed getXMLPage() into a generator
The program tended to run out of memory when processing very large pages (i.e.,
pages with extremely large numbers of revisions or pages with large numbers of
very large revisions). This mitigates the problem by changing getXMLPage() into
a generator which allows us to write pages after each request to the API.

This required changes to the getXMLPage() function and also changes to other
parts of the code that called it.

Additionally, when the function was called, its text was checked in several
ways. This required a few changes, including a running tally of revisions
instead of a post hoc check, and error checking moved into an Exception
rather than just an if statement that looked at the final result.
9 years ago
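The shift described in 145b2eaaf4 — write each page as it arrives and keep a running revision tally instead of checking one huge string afterwards — can be sketched like this. All names here are hypothetical stand-ins for the real API plumbing:

```python
def export_pages(fetch_page, titles, out):
    """Stream pages to `out` as they are fetched, keeping a running
    revision tally, so memory use stays flat regardless of how many
    revisions a page has (illustrative sketch)."""
    total_revisions = 0
    for title in titles:
        xml = fetch_page(title)          # one API request per page
        out.write(xml)                   # flush to disk immediately
        total_revisions += xml.count("<revision>")
    return total_revisions
```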
nemobis b3ef165529 Merge pull request #194 from mrshu/mrshu/dumpgenerator-pep8fied
dumpgenerator: AutoPEP8-fied
10 years ago
mr.Shu 04446a40a5 dumpgenerator: AutoPEP8-fied
* Used autopep8 to made sure the code looks nice and is actually PEP8
  compliant.

Signed-off-by: mr.Shu <mr@shu.io>
10 years ago
nemobis e0f8e36bf4 Merge pull request #190 from PiRSquared17/api-allpages-disabled
Fallback to getPageTitlesScraper() if API allpages disabled
10 years ago
PiRSquared17 757019521a Fallback to scraper if API allpages disabled 10 years ago
PiRSquared17 4b3c862a58 Comment debugging print, fix test 10 years ago
PiRSquared17 7a1db0525b Add more wiki engines to getWikiEngine 10 years ago
PiRSquared17 4ceb9ad72e Preserve default headers, fixing openwrt test 10 years ago
PiRSquared17 b4818d2985 Avoid infinite loop in getImageNamesScraper 10 years ago
nemobis 8a9b50b51d Merge pull request #183 from PiRSquared17/patch-7
Retry on ConnectionError in getXMLPageCore
10 years ago
nemobis 19c48d3dd0 Merge pull request #180 from PiRSquared17/patch-2
Get as much information from siteinfo as possible
10 years ago
Pi R. Squared f7187b7048 Retry on ConnectionError in getXMLPageCore
Previously it just gave a fatal error.
10 years ago
Pi R. Squared f31e4e6451 Dict not hashable, also not needed
Quick fix.
10 years ago
Pi R. Squared 399f609d70 AllPages API hack for old versions of MediaWiki
New API format: http://www.mediawiki.org/w/api.php?action=query&list=allpages&apnamespace=0&apfrom=!&format=json&aplimit=500
Old API format: http://wiki.damirsystems.com/api.php?action=query&list=allpages&apnamespace=0&apfrom=!&format=json
10 years ago
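The hack in 399f609d70 has to support both generations of the AllPages API: newer MediaWiki returns an explicit continue token, while old versions only let you resume from the last title seen. A hedged sketch — the `query-continue`/`apfrom` field names are assumptions based on the URLs quoted above, not a verified match for every MediaWiki release:

```python
def next_allpages_params(params, response):
    """Compute the parameters for the next allpages request, handling
    both new (continue token) and old (resume from last title) API
    formats.  Returns None when there is nothing left to fetch."""
    cont = response.get("query-continue", {}).get("allpages")
    if cont:                      # newer API: explicit continue fields
        params.update(cont)
        return params
    pages = response.get("query", {}).get("allpages", [])
    if pages:                     # older API: resume after last title
        params["apfrom"] = pages[-1]["title"]
        return params
    return None
```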
Pi R. Squared 498b64da3f Try getting index.php from siteinfo API
Fixes #49
10 years ago
Pi R. Squared ff0d230d08 Get as much information from siteinfo as possible
Properly fixes #74.

Algorithm:
1. Try all siteinfo props. If this gives an error, continue. Otherwise, stop.
2. Try MediaWiki 1.11-1.12 siteinfo props. If this gives an error, continue. Otherwise, stop.
3. Try minimal siteinfo props. Stop.
Not using sishowalldb=1 to avoid possible error (by default), since this data is of little use anyway.
10 years ago
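The three-step algorithm in ff0d230d08 can be sketched as a fallback loop. `query` is a hypothetical stand-in for the real API call, assumed to return a dict with an `error` key on failure; the exact prop strings for each tier are illustrative, not copied from the commit:

```python
def get_siteinfo(query):
    """Try progressively smaller siteinfo prop sets: full props first,
    then an older-MediaWiki-compatible set, then a minimal one.
    Returns the first response without an error (sketch)."""
    attempts = [
        "general|namespaces|namespacealiases|statistics|interwikimap",
        "general|namespaces|statistics|interwikimap",
        "general|namespaces",
    ]
    result = {}
    for props in attempts:
        result = query({"action": "query", "meta": "siteinfo",
                        "siprop": props, "format": "json"})
        if "error" not in result:
            return result
    return result  # last attempt's output, even if it errored
```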
Pi R. Squared 322604cc23 Encode title using UTF-8 before printing
This fixes #170 and closes #174.
10 years ago
nemobis 11368310ee Merge pull request #173 from nemobis/issue/131
Fix #131: ValueError: No JSON object could be decoded
10 years ago
Nemo bis 026c2a9a25 Issue 131: ValueError: No JSON object could be decoded 10 years ago
Sean Yeh 38e73c1cf7 Fix argument parsing to accept delay as a number 10 years ago
Emilio J. Rodríguez-Posada a2efca27b8 improving API/Index calculation 10 years ago
Emilio J. Rodríguez-Posada 4bc43a1c0f improved help messages 10 years ago
Emilio J. Rodríguez-Posada 51806f5a3d fixed #160; improved args parsing and --help; improved API/Index estimate from URL; 10 years ago
Emilio J. Rodríguez-Posada dd7df0cc01 Merge branch 'master' of https://github.com/WikiTeam/wikiteam 10 years ago
Emilio J. Rodríguez-Posada f3b388fc79 a first approach to auto-detect API/Index.php using URL to the Main_Page 10 years ago
Erkan Yilmaz 44b80ceb88 fix link for tutorial 10 years ago
balr0g 8485a5004d Pass session 10 years ago
balr0g fd6ea19b4b config['api'] is set but empty; properly handle this 10 years ago
nemobis 1ff96238eb Denote as alpha until revamp is tested
Per emijrp who asked not to run dumps with this, at https://github.com/WikiTeam/wikiteam/issues/104#issuecomment-48039143
Currently proposed things to fix or check: https://github.com/WikiTeam/wikiteam/issues?milestone=1&state=open
10 years ago
Emilio J. Rodríguez-Posada 89e3c3e462 standardize getImage* function names 10 years ago
Emilio J. Rodríguez-Posada aaa1822759 improving image list downloader 10 years ago
Emilio J. Rodríguez-Posada 88c9468c0e improving image list downloader 10 years ago
balr0g 3929e4eb9c Cleanups and error fixes suggested by flake8 (pep8 + pyflakes) 10 years ago
Emilio J. Rodríguez-Posada c07b527e5d adding session to getWikiEngine() 10 years ago
Emilio J. Rodríguez-Posada 30c153ce1f chg: using 'with open' for files 10 years ago
balr0g 9aa3c4a0e1 Removed all traces of urllib except for encode/decode; more bugs fixed. 10 years ago
balr0g c8e11a949b Initial port to Requests 10 years ago
Emilio J. Rodríguez-Posada 9553e3550c adding wiki engine detector 10 years ago
Emilio J. Rodríguez-Posada eb97cf1adf version 0.2.2 and tiny bits in --help 10 years ago
balr0g 50b011f90d Initial port to argparse 10 years ago
Emilio J. Rodríguez-Posada 568deef081 adding comments for clarification 10 years ago
Emilio J. Rodríguez-Posada d4eed1f738 fixing #127 and #134: now works with APIs that return a 'name' field for images and those that don't (in that case we unquote over ascii); also fixing a bug that re-downloaded the image list when it was already complete 10 years ago
Emilio J. Rodríguez-Posada 005de23c1d adding gzip to siteinfo downloader 10 years ago
Emilio J. Rodríguez-Posada d79ea64d41 fixing issue #97 pretty siteinfo json saving, indenting 4 chars 10 years ago
Emilio J. Rodríguez-Posada 3854a344fe Merge branch 'master' of https://github.com/WikiTeam/wikiteam 10 years ago
Emilio J. Rodríguez-Posada 1c1f0dbb86 replacing XML with JSON in image downloading 10 years ago
balr0g 481323c7f7 Don't try to download sites with disabled API 10 years ago
nemobis 1933db8a94 Merge pull request #124 from balr0g/scraper-unicode-title-fix
Fix scraper for sites with Unicode titles
10 years ago
balr0g 62be069026 Fix scraper for sites with Unicode titles 10 years ago
nemobis 62d961fa97 Fix typo, unused variable spotted by balrog 10 years ago
nemobis 95bc2dec38 Link GitHub issue tracker 10 years ago
balr0g d60e560571 Add Content-Encoding: gzip support 10 years ago
Emilio J. Rodríguez-Posada 5261811fa4 only if api exists 10 years ago
Emilio J. Rodríguez-Posada 610764619a add saveSiteInfo() to download meta=siteinfo data from API to a file 10 years ago
Emilio J. Rodríguez-Posada d395433513 comments and newlines 10 years ago
Emilio J. Rodríguez-Posada 5eff4bd072 comments and tabs 10 years ago
Emilio J. Rodríguez-Posada 0b0c40f5da adding more user-agents, but keeps the first as default by now 10 years ago
Emilio J. Rodríguez-Posada 81468c4a7c using JSON to retrieve namespaces via API 10 years ago
Emilio J. Rodríguez-Posada 703eb9011b improving checkAPI() using JSON properly loaded 10 years ago
Emilio J. Rodríguez-Posada 44d3fe1e36 Merge pull request #117 from nemobis/bug/48
Issue 48: Check that API actually works
10 years ago
Emilio J. Rodríguez-Posada fc80556d8a merging... 10 years ago
Emilio J. Rodríguez-Posada f474deb71f now we use JSON properly in getPageTitlesAPI(), instead of XML; fixing some wrong prints, now support utf-8 10 years ago
Federico Leva 997276110c Issue 46: dumpgenerator should follow redirects
Patch by @balr0g from libsonic (GPLv3+).
10 years ago
Federico Leva a8e1575879 Issue 48: Check that API actually works 10 years ago
Emilio J. Rodríguez-Posada c9aa165504 fixing header with the new year, info and documentation link 10 years ago
nemobis ac4c93c12a Issue 85: more cross-platform shebang on all scripts
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@962 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
10 years ago
nemobis 403dc213ef Issue 71: English-only match for an older case
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@942 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
10 years ago
nemobis 0ede45b7cf Special:BadTitle works only in English wikis
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@917 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
10 years ago
nemobis 034866a32e Handle permissions-errors for wikis requiring login or whatever
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@916 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
10 years ago
nemobis 6c69d9800f Followup, delay needs config; should be BC
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@914 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
10 years ago
nemobis 55185467e1 Add delay to all checking and listing functions, crappy hosts die on them
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@902 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
10 years ago
nemobis 6efe406ea5 Followup r877, first check most common conditions for shortcut performance
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@882 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
Hydriz 611d13f8c5 Follow up r877, check the number of revision tags
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@878 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
Hydriz 64bd837cab (Issue 34) XML integrity check inside the code
This *really* fixes the issue and asks the user whether or not to regenerate a dump.

git-svn-id: https://wikiteam.googlecode.com/svn/trunk@877 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
Hydriz 79047a3ded (Issue 71) Use a better check for private wikis
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@873 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 26873ad495 Fix typo, make domain2prefix quiet again
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@869 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 626118cfab Let's call it 0.2 then, a bump to 1 would require announcements etc. We're not there yet (API support etc.).
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@867 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
Hydriz df1e7efafd Change version of dumpgenerator.py to 1.1. Using 0.1 is rather confusing.
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@866 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis e1d4de3179 Uncomment appended index.php for guess in most configurations
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@864 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 5d34d9512a Needs to be non-matching group
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@863 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 82ba173739 Issue 22: allimages now uses aicontinue, not aifrom
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@862 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis c6546ff935 Issue 71: Don't try to download private wikis, first workaround
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@861 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 776038666f Issue 72: revert r857, just define everything in launcher.py
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@860 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 6113fa3340 Add delay to getPageTitlesScraper
We must be nice here too or naughty hosts fail badly, for instance wikkii.com gave

urllib2.HTTPError: HTTP Error 302: The HTTP server returned a redirect error that would lead to an infinite loop.
The last 30x error message was:
Moved Temporarily

git-svn-id: https://wikiteam.googlecode.com/svn/trunk@859 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 9e1b13e173 Correct --help: format is --delay=5, not --delay:5
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@858 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 6430ac5f47 Check for the existence of the array in domain2prefix instead; uploader.py failed on python 2.6
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@857 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis ef7d527e86 Add some advice about editthis.info for usage via launcher.py
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@855 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 4820339d10 Fix r842, patch by balrog; Schbirid reported python error in CleanHTML
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@854 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 7c94815e2c Issue 68: Use GET, not POST, to download images; some harm and no good
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@851 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis e1b34b7f6b Fix whitespace
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@844 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 57e226c049 Use urllib2 and set user agent in some more places; some webhosts block urllib.
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@842 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
Hydriz 414fb1988f Add a little more singular/plural support.
Along with this commit:
* Changed original %s to %d for number of edits, since it is recognised as an integer.
* Directly defined the number of edits to be 1 when the if condition is true, to optimise performance.

git-svn-id: https://wikiteam.googlecode.com/svn/trunk@841 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis 13ebee4c28 Issue 60: Add authentication cookie support, patch by Fredrik Roubert
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@840 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis b34f01646a Use getUserAgent in one more place, urllib is blocked by some
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@839 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
Hydriz 8b4480be64 Adding plural support for number of edits saved for a page
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@831 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
nemobis ed8d174d5a Issue #61: some skins hide that stuff, use meta tag generator
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@829 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
emijrp f1874656ed comments
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@818 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
emijrp af81adebeb comments
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@817 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
emijrp 424588a55c comments
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@816 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
emijrp 360d1c1fa1 fixing createnewdump() and resumepreviousdump()
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@815 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
emijrp 8295990df0 moving code to functions; tiny changes in comments
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@814 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago
emijrp 79a310c470 tiny changes in comments, some clarifications
git-svn-id: https://wikiteam.googlecode.com/svn/trunk@813 31edc4fc-5e31-b4c4-d58b-c8bc928bcb95
11 years ago