Importing issue from mailman2 to mailman3
Hi,
I'm in the process of moving 1400-ish lists from a very old mailman2 installation to an installation on Ubuntu 18.04 with the version of mailman3 packaged for that distro. (Yes, that was a mistake, but I'm committed at this point.)
I'm doing this import as per:
sudo -u list mailman import21 somelist@one-of-the-domains /tmp/lists/somelist/config.pck
where /tmp/lists includes all of the list configs from the old installation.
Most of it's going fine, as best I can see. However, I get exceptions on some lists that look like this:
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
    context)
  File "/usr/lib/python3/dist-packages/sqlalchemy/engine/default.py", line 470, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/local/lib/python3.6/dist-packages/pymysql/cursors.py", line 163, in execute
    result = self._query(query)
  File "/usr/local/lib/python3.6/dist-packages/pymysql/cursors.py", line 321, in _query
    conn.query(q)
  File "/usr/local/lib/python3.6/dist-packages/pymysql/connections.py", line 505, in query
    self._affected_rows = self._read_query_result(unbuffered=unbuffered)
  File "/usr/local/lib/python3.6/dist-packages/pymysql/connections.py", line 724, in _read_query_result
    result.read()
  File "/usr/local/lib/python3.6/dist-packages/pymysql/connections.py", line 1069, in read
    first_packet = self.connection._read_packet()
  File "/usr/local/lib/python3.6/dist-packages/pymysql/connections.py", line 676, in _read_packet
    packet.raise_for_error()
  File "/usr/local/lib/python3.6/dist-packages/pymysql/protocol.py", line 223, in raise_for_error
    err.raise_mysql_exception(self._data)
  File "/usr/local/lib/python3.6/dist-packages/pymysql/err.py", line 107, in raise_mysql_exception
    raise errorclass(errno, errval)
pymysql.err.DataError: (1406, "Data too long for column 'info' at row 1")
My assumption is that what's in the "Info" field on the old mailman2 config is simply too long. Is that correct?
In that case, I suppose the right thing is to (1) edit each one, reducing the size, which'll create a new config.pck, (2) pull it over to the new machine, and (3) run the import again.
Seem reasonable?
Cheers,
David
On 3/7/21 3:42 AM, David Partain via Mailman-users wrote:
Most of it's going fine, as best I can see. However, I get exceptions on some lists that look like this:
Traceback (most recent call last):
...
pymysql.err.DataError: (1406, "Data too long for column 'info' at row 1")
My assumption is that what's in the "Info" field on the old mailman2 config is simply too long. Is that correct?
Yes. For MySQL and MariaDB the info column is VARCHAR(255). It probably should be defined as a type that allows a longer value, but for now with those DB engines, it is limited to 255 bytes.
In that case, I suppose the right thing is to (1) edit each one, reducing the size, which'll create a new config.pck, (2) pull it over to the new machine, and (3) run the import again.
Seem reasonable?
Yes, that's the workaround. Or you could start over with postgresql as the DB which doesn't have that limitation.
--
Mark Sapiro <mark@msapiro.net>        The highway is for gamblers,
San Francisco Bay Area, California    better use your sense - B. Dylan
Hi Mark,
Most of it's going fine, as best I can see. However, I get exceptions on some lists that look like this:
Traceback (most recent call last):
...
pymysql.err.DataError: (1406, "Data too long for column 'info' at row 1")
My assumption is that what's in the "Info" field on the old mailman2 config is simply too long. Is that correct?
Yes. For MySQL and MariaDB the info column is VARCHAR(255). It probably should be defined as a type that allows a longer value, but for now with those DB engines, it is limited to 255 bytes.
In that case, I suppose the right thing is to (1) edit each one, reducing the size, which'll create a new config.pck, (2) pull it over to the new machine, and (3) run the import again. Seem reasonable?
Yes, that's the workaround. Or you could start over with postgresql as the DB which doesn't have that limitation.
Thanks! That did the trick.
I won't switch today :) I just activated the new server and am serving 1400 lists from there even as we speak.
Cheers,
David
participants (2)
- David Partain
- Mark Sapiro