OpenAPI Tools / openapi-generator / Issues / #11421
Closed

Issue created Jan 26, 2022 by Administrator (@root), Contributor. 5 of 6 checklist items completed.

[BUG] [Python-flask] generated test code is buggy - samples/openapi3/server/petstore/python-flask

Created by: Johnlon

Bug Report Checklist

  • Have you provided a full/minimal spec to reproduce the issue?
  • Have you validated the input using an OpenAPI validator (example)?
  • Have you tested with the latest master to confirm the issue still exists?
  • Have you searched for related issues/PRs?
  • What's the actual output vs expected output?
  • [Optional] Sponsorship to speed up the bug fix or feature request (example)

This hurts for a few reasons

  • I want to make a change but I need a working platform to start with and I don't have that.
  • Easily discoverable issues damage the reputation of the product and its development practices. I assume the build pipeline does not run all the tests for the generated samples, so how can one expect those samples to remain healthy? They just rot.
  • There seems to be a practice of not regenerating and committing the updated samples each time a change is made (and the tests run). Unless this is done, we are leaving time bombs in the code, because the impact of a given change isn't felt by other generators' samples until months or years later.

Reproduce by ...

cd samples/openapi3/server/petstore/python-flask
tox

Runs the tests but they fail.
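To iterate faster than a full tox run, the same tests can presumably also be run directly with pytest (assuming the generated project includes the requirements.txt and test-requirements.txt files the python-flask generator normally emits):

pip install -r requirements.txt -r test-requirements.txt
pytest openapi_server/test/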

Description

The generated code is buggy.

As a good dev I ran the existing tests before changing anything and got a bunch of failures. Some of the failures suggest that the generated code is not valid, i.e. there's an existing problem in the generator. I'd hoped that the sample would be clean and pass, and then I could start work.

Many of the tests fail due to 401s, but I'd also consider that a bug, as the tests just ought to work either way.

If the generator/template is already buggy, then is it worth me continuing with this modification? I'd like to start with a working environment. Is there any help to straighten this out?

Running "tox" I get ...

openapi_server/test/test_user_controller.py FFFF.F.F
openapi_server/test/test_pet_controller.py s.s.s.Fs.
openapi_server/test/test_store_controller.py .F..

If I regenerate the sample then the situation is also imperfect - actually slightly worse, I think ...

openapi_server/test/test_user_controller.py .FFFFFF.                                                                                                                                 
openapi_server/test/test_store_controller.py ...F                                                                                                                                    
openapi_server/test/test_pet_controller.py .ssF..ss.

Some samples ...

WARNING  connexion.operations.openapi3:openapi.py:269 this operation accepts multiple content types, using application/json
____________________________________________________________________ TestPetController.test_update_pet_status_with_enum ____________________________________________________________________

self = <openapi_server.test.test_pet_controller.TestPetController testMethod=test_update_pet_status_with_enum>

    def test_update_pet_status_with_enum(self):
        """Test case for update_pet_status_with_enum

        Set the status of a pet in the store using an enum
        """
>       query_string = [('status', pending)]
E       NameError: name 'pending' is not defined

openapi_server/test/test_pet_controller.py:165: NameError
------------------------------------------------------------------------------------ Captured log call -------------------------------------------------------------------------------------
WARNING  connexion.operations.openapi3:openapi.py:269 this operation accepts multiple content types, using application/json
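For what it's worth, this looks like a quoting bug in the test template: the enum default is emitted as a bare Python identifier rather than a string literal. A minimal sketch of the presumed intent (the quoted value is my assumption about what the template should emit):

    # Generated (broken): the enum value 'pending' becomes a bare name
    query_string = [('status', pending)]

    # Presumed intent: the enum value is a string literal
    query_string = [('status', 'pending')]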
___________________________________________________________________________ TestStoreController.test_place_order ___________________________________________________________________________

self = <openapi_server.test.test_store_controller.TestStoreController testMethod=test_place_order>

        def test_place_order(self):
            """Test case for place_order

            Place an order for a pet
            """
            order = {
      "petId" : 6,
      "quantity" : 1,
      "id" : 0,
      "shipDate" : "2000-01-23T04:56:07.000+00:00",
>     "complete" : false,
      "status" : "placed"
    }
E   NameError: name 'false' is not defined

openapi_server/test/test_store_controller.py:71: NameError
------------------------------------------------------------------------------------ Captured log call -------------------------------------------------------------------------------------
WARNING  connexion.operations.openapi3:openapi.py:269 this operation accepts multiple content types, using application/json
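This one looks like raw JSON leaking into Python source: JSON's false is not a Python name. A sketch of the presumed intent using the Python boolean (parsing the example text with json.loads would work just as well):

    # Presumed intent: the JSON example rendered as a Python dict
    order = {
        "petId": 6,
        "quantity": 1,
        "id": 0,
        "shipDate": "2000-01-23T04:56:07.000+00:00",
        "complete": False,  # Python's False, not JSON's `false`
        "status": "placed"
    }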

___________________________________________________________________________ TestUserController.test_logout_user ____________________________________________________________________________

self = <openapi_server.test.test_user_controller.TestUserController testMethod=test_logout_user>

    def test_logout_user(self):
        """Test case for logout_user

        Logs out current logged in user session
        """
        headers = {
            'auth_cookie': 'special-key',
        }
        response = self.client.open(
            '/v2/user/logout',
            method='GET',
            headers=headers)
        self.assert200(response,
>                      'Response body is : ' + response.data.decode('utf-8'))

openapi_server/test/test_user_controller.py:161:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.tox/py3/lib/python3.6/site-packages/flask_testing/utils.py:336: in assert200
    self.assertStatus(response, 200, message)
.tox/py3/lib/python3.6/site-packages/flask_testing/utils.py:324: in assertStatus
    self.assertEqual(response.status_code, status_code, message)
E   AssertionError: 401 != 200 : Response body is : {
E     "detail": "No authorization token provided",
E     "status": 401,
E     "title": "Unauthorized",
E     "type": "about:blank"
E   }
------------------------------------------------------------------------------------ Captured log call -------------------------------------------------------------------------------------
WARNING  connexion.operations.openapi3:openapi.py:269 this operation accepts multiple content types, using application/json
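The 401s may be a different kind of bug: the test sends the key as a plain header named auth_cookie, but if the spec declares auth_cookie as an apiKey with in: cookie (an assumption on my part - I haven't checked the petstore spec), connexion will only look for it in the Cookie header. A hedged sketch of that guess:

    # Assumption: auth_cookie is declared `in: cookie` in the spec, so the
    # key must be sent as an HTTP cookie, not a custom header.
    headers = {
        'Cookie': 'auth_cookie=special-key',
    }
    response = self.client.open(
        '/v2/user/logout',
        method='GET',
        headers=headers)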

The web app starts, though, so perhaps only the tests are mangled?

openapi-generator version

Current master - I'm trying to add a feature

Generation Details

Existing master samples as of the current date and time

Steps to reproduce

See "Reproduce by ..." above.