
Python3 Compatible #188


Closed
wants to merge 1 commit into from

Conversation


@ralic ralic commented Dec 11, 2017

Built and tested on Ubuntu


dsoprea commented Dec 12, 2017

Thank you for your contribution.

I ran into issues. You didn't specify a specific Python version, so I tried with 3.4.

I needed the following changes just to get off the ground:

diff --git a/gdrivefs/oauth_authorize.py b/gdrivefs/oauth_authorize.py
index 09efac5..243d85e 100644
--- a/gdrivefs/oauth_authorize.py
+++ b/gdrivefs/oauth_authorize.py
@@ -37,7 +37,7 @@ class OauthAuthorize(object):
 
         api_credentials = gdrivefs.conf.Conf.get('api_credentials')
 
-        with tempfile.NamedTemporaryFile() as f:
+        with tempfile.NamedTemporaryFile(mode='r+') as f:
             json.dump(api_credentials, f)
             f.flush()
 
@@ -89,7 +89,7 @@ class OauthAuthorize(object):
             _LOGGER.debug("Checking for cached credentials: %s",
                           self.__creds_filepath)
 
-            with open(self.__creds_filepath) as cache:
+            with open(self.__creds_filepath, 'rb') as cache:
                 credentials_serialized = cache.read()
 
             # If we're here, we have serialized credentials information.
@@ -139,7 +139,7 @@ class OauthAuthorize(object):
 
         # Write cache file.
 
-        with open(self.__creds_filepath, 'w') as cache:
+        with open(self.__creds_filepath, 'wb') as cache:
             cache.write(credentials_serialized)
 
     def step2_doexchange(self, auth_code):
diff --git a/gdrivefs/resources/scripts/gdfstool b/gdrivefs/resources/scripts/gdfstool
index 9722bbc..b1255ed 100755
--- a/gdrivefs/resources/scripts/gdfstool
+++ b/gdrivefs/resources/scripts/gdfstool
@@ -163,6 +163,10 @@ def main():
 
     args = p.parse_args()
 
+    if args.command is None:
+        p.print_help()
+        sys.exit(1)
+
     gdrivefs.config.log.configure(is_debug=args.verbose)
 
     if args.command == 'auth_get_url':

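The `NamedTemporaryFile` change is needed because, under Python 3, temporary files open in binary mode (`w+b`) by default, while `json.dump` emits `str`. A standalone sketch of the failure and the fix (placeholder data, not the project's actual credentials handling):

```python
import json
import tempfile

api_credentials = {'client_id': 'example'}  # placeholder data

# Default mode is 'w+b': json.dump writes str, so this raises
# TypeError on Python 3.
try:
    with tempfile.NamedTemporaryFile() as f:
        json.dump(api_credentials, f)
    raised = False
except TypeError:
    raised = True

# Opening in text mode ('r+') lets json.dump write directly, and the
# file can be read back without reopening.
with tempfile.NamedTemporaryFile(mode='r+') as f:
    json.dump(api_credentials, f)
    f.flush()
    f.seek(0)
    round_tripped = json.load(f)
```

The `'rb'`/`'wb'` changes in the same diff follow the same str-versus-bytes logic, presumably because the serialized credentials are bytes under Python 3.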
I added a check for a subcommand because, apparently, argparse no longer implicitly requires a subcommand in Python 3.4.

After the changes above were introduced, I got the following issues:

$ gdfs -v default /tmp/gdfs/
2017-12-12 00:49:57,755 [__main__ DEBUG] Mounting GD with creds at [default]: /tmp/gdfs/
2017-12-12 00:49:57,755 [gdrivefs.gdfuse DEBUG] PERMS: F=777 E=666 NE=444
2017-12-12 00:49:57,757 [gdrivefs.drive DEBUG] Getting authorized HTTP tunnel.
2017-12-12 00:49:57,757 [gdrivefs.drive DEBUG] Got authorized tunnel.
2017-12-12 00:49:57,758 [googleapiclient.discovery_cache WARNING] file_cache is unavailable when using oauth2client >= 4.0.0
Traceback (most recent call last):
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/lib/python3.4/site-packages/googleapiclient/discovery_cache/__init__.py", line 36, in autodetect
    from google.appengine.api import memcache
ImportError: No module named 'google'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/lib/python3.4/site-packages/googleapiclient/discovery_cache/file_cache.py", line 33, in <module>
    from oauth2client.contrib.locked_file import LockedFile
ImportError: No module named 'oauth2client.contrib.locked_file'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/lib/python3.4/site-packages/googleapiclient/discovery_cache/file_cache.py", line 37, in <module>
    from oauth2client.locked_file import LockedFile
ImportError: No module named 'oauth2client.locked_file'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/lib/python3.4/site-packages/googleapiclient/discovery_cache/__init__.py", line 41, in autodetect
    from . import file_cache
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/lib/python3.4/site-packages/googleapiclient/discovery_cache/file_cache.py", line 41, in <module>
    'file_cache is unavailable when using oauth2client >= 4.0.0')
ImportError: file_cache is unavailable when using oauth2client >= 4.0.0
2017-12-12 00:49:57,758 [googleapiclient.discovery INFO] URL being requested: GET https://www.googleapis.com/discovery/v1/apis/drive/v2/rest
2017-12-12 00:49:57,959 [googleapiclient.discovery INFO] URL being requested: GET https://www.googleapis.com/drive/v2/about?alt=json
fuse: bad mount point `/tmp/gdfs/': Input/output error
Traceback (most recent call last):
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/bin/gdfs", line 6, in <module>
    exec(compile(open(__file__).read(), __file__, 'exec'))
  File "/home/local/MAGICLEAP/doprea/development/python/gdrivefs.188/gdrivefs/resources/scripts/gdfs", line 62, in <module>
    _main()
  File "/home/local/MAGICLEAP/doprea/development/python/gdrivefs.188/gdrivefs/resources/scripts/gdfs", line 59, in _main
    option_string=option_string)
  File "/home/local/MAGICLEAP/doprea/development/python/gdrivefs.188/gdrivefs/gdfuse.py", line 872, in mount
    **fuse_opts)
  File "/home/local/MAGICLEAP/doprea/.virtualenvs/gdrivefs.188/lib/python3.4/site-packages/fuse.py", line 480, in __init__
    raise RuntimeError(err)
RuntimeError: 1


dsoprea commented Jan 28, 2019

I brought this project into compatibility with Python 3.6 as of last night.

@dsoprea dsoprea closed this Jan 28, 2019