
A GPT-powered CoPilot-like DRAGON to erode the software developer profession even further #21

Open
KOLANICH opened this issue Nov 5, 2022 · 1 comment

Comments


KOLANICH commented Nov 5, 2022

Hi. While making an argument about why non-English-based programming languages are unsuitable for practical software development (and using DRAGON as one of the examples — by the way, it was the first "programming language" I learned, at the age of 5, though I have never used it for anything practical), I came up with an idea that I want to share with you.

As we know, the unreachable holy grail of programming is writing programs in the form "makeЗа36ись();" (a mixed Latin/Cyrillic joke identifier, roughly "makeItAwesome();").

Even in a natural-language-based dialect of DRAGON, icons usually contain fairly simple actions, and the flowchart defines only the high-level structure.

It should be possible to transform these simple actions into pieces of code with a neural network capable of transforming natural language into code, like GitHub CoPilot or other GPT-based networks. But these pieces of code must be stitched together, and this can be done incrementally, one icon at a time, because GPT-based networks are auto-completers. So basically one needs to transform a DRAGON scheme into a program one icon at a time: start from a bare stub filled in with the algorithm's name, inject the natural-language descriptions from Action icons into the AST, then trigger autocompletion to generate the code. I don't know the best way to do it. I guess the best control over the autocompletion would be achieved if every Action icon is temporarily mapped to a function (this way specifying its input arguments) that outputs the whole state of the program, with its generated body inlined afterwards, but I'm not sure GPT will remember the context. To help GPT keep the context, it is probably best to define the function immediately where it is called and then call it. For Question icons, a predicate lambda should be generated. A rough sketch of this loop is given below.
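Here is a minimal sketch of what such an icon-by-icon translation loop might look like, in Python. Everything in it is an assumption for illustration: `complete()` stands in for whatever GPT-style completion endpoint is used (CoPilot, a GPT model behind an API, etc.), and the `ActionIcon`/`QuestionIcon` classes and the prompt layout are hypothetical, not part of DRAGON or of any real tool.

```python
from dataclasses import dataclass

@dataclass
class ActionIcon:
    description: str   # natural-language text written inside the icon
    inputs: list[str]  # names of the state variables the action touches

@dataclass
class QuestionIcon:
    description: str   # natural-language predicate, e.g. "the file exists"

def complete(prompt: str) -> str:
    """Stand-in for a GPT-based auto-completer: returns code continuing `prompt`."""
    raise NotImplementedError("plug a real completion backend in here")

def emit_action(icon: ActionIcon, program: str) -> str:
    # Temporarily map the icon to a function so its input arguments are
    # explicit; define it right where it is used, so the model sees the
    # whole program so far as context, then let autocompletion fill the body.
    args = ", ".join(icon.inputs)
    stub = (
        f"{program}\n"
        f"def step({args}):\n"
        f'    """{icon.description}"""\n'
    )
    body = complete(stub)  # model-generated function body
    # Call the function immediately after defining it; its generated body
    # can later be inlined at the call site.
    return stub + body + f"\n{args} = step({args})\n"

def emit_question(icon: QuestionIcon, program: str) -> str:
    # Question icons become predicate lambdas guarding the two branches.
    stub = f"{program}\n# condition: {icon.description}\npredicate = lambda state: "
    return stub + complete(stub) + "\n"

def translate(algo_name: str, icons: list) -> str:
    # Start from a bare stub carrying only the algorithm's name, then grow
    # the program one icon at a time.
    program = f"# Algorithm: {algo_name}\n"
    for icon in icons:
        if isinstance(icon, ActionIcon):
            program = emit_action(icon, program)
        else:
            program = emit_question(icon, program)
    return program
```

This sketch linearizes the scheme; real DRAGON branching would need the Question icon's predicate to actually select between the translations of two sub-schemes, which is omitted here for brevity.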

I guess this could be a way to transform natural-language-based DRAGON schemes into programs in programming languages. Of course, the output will likely require some postprocessing to become workable. That could be attempted with GPT too: pipe it the wrong source, the compiler error, and a command to fix the error, then trigger autocompletion again. A sketch of such a repair loop follows.
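A minimal sketch of that fix-the-error loop, under the same assumptions as above (the hypothetical `complete()` stand-in, and `py_compile` playing the role of the compiler since the sketch targets Python):

```python
import py_compile
import tempfile

def compile_check(source: str) -> str | None:
    """Return the compiler error text for `source`, or None if it compiles."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        py_compile.compile(path, doraise=True)
        return None
    except py_compile.PyCompileError as e:
        return str(e)

def repair(source: str, max_rounds: int = 5) -> str:
    # Pipe the wrong source, the compiler error, and a command to fix the
    # error into the model, then trigger autocompletion again; repeat until
    # the program compiles or we give up.
    for _ in range(max_rounds):
        error = compile_check(source)
        if error is None:
            return source
        prompt = (
            "# The program below fails to compile.\n"
            f"{source}\n"
            f"# Compiler error:\n# {error}\n"
            "# Fix the error. Fixed program:\n"
        )
        source = complete(prompt)
    return source
```

Bounding the number of rounds matters: nothing guarantees the model converges on a compilable program, so the loop has to give up at some point and hand the result back for manual postprocessing.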


KOLANICH commented Nov 5, 2022

P.S. I consider all the oversimplified (with the purpose of selling them to non-programmers) and/or visual-first programming languages to be children's toys, incompatible with the approaches used in present-day software development. For example, inputting graphic representations with a mouse is extremely inconvenient (and this will likely not be fixed until brain-computer interfaces 1. become mature enough to replace the keyboard and mouse completely and robust enough to read application-specific mind commands, and 2. people become fluent enough with the technology, which would likely require a new generation to be born and plugged into BCIs from birth), and all the tools around programming languages (VCSs, static analysers) are line-based rather than AST-, CFG- and DFG-based.
