In the beginning, there was no difference between using a computer and programming. That was the only way they worked; you created a program with punchcards or whatever, and the computer ran it. Well, not you. Your dad maybe.
Then some history occurred, and you could use a computer without writing your own programs for it or even knowing how. And then some more stuff happened, but using a computer still sometimes resembled programming. This was the era of interactive prompts, like on the Apple II or, sort of, in DOS. You’d type commands and the computer would run them, and sometimes you’d save a whole batch of commands to a file so the computer could run them all in a row later.
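Here’s roughly what I mean, using Python’s interactive prompt as a modern stand-in (my analogy, not the actual Apple II or DOS experience). You type commands one at a time and the machine runs each one:

    $ python3
    >>> total = 0
    >>> for n in [3, 12, 1]:
    ...     total = total + n
    ...
    >>> total        # the prompt echoes the value back
    16

    # add_up.py -- the same commands, saved for later.
    # Run them all in a row with: python3 add_up.py
    total = 0
    for n in [3, 12, 1]:
        total = total + n
    print(total)     # a script has to print; nothing echoes automatically

Paste the same lines into a file (add_up.py is a name I just made up) and now it’s a program. The border between using and programming was basically a filename.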
Then people started writing fancy UIs for computers, and those caught on, and computer use spread. But now using a computer is, once again, totally different from programming one. I know there are higher-level scripting languages, but there’s still a pretty thick border between writing code and running applications.
So why hasn’t programming kept up? If we can make a simple, “intuitive” UI layer for our operating systems, why don’t we have simple, intuitive UIs for programming? Why does every programmer still use a text editor, however fancy?
I really don’t know what a textless-UI development environment would look like or how it would work. I guess it would be similar to Visual Basic or MFC or whatever the current generation of that stuff is, but those have always forced you to know the syntax of an actual language and write some text yourself. WYSIWYG programming seems right out, since it doesn’t even really make sense as a concept, but there’s got to be some middle ground.
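For what it’s worth, here’s the kind of thing I mean about the VB/MFC lineage, sketched in Python with Tkinter since that’s what I have handy (the little form itself is my invention, nothing canonical). The layout half is exactly what a drag-and-drop designer could generate without you typing anything; the behavior half is still text in a real language:

    import tkinter as tk

    # The "designer" half: a GUI builder could lay this out for you.
    root = tk.Tk()
    root.title("Untitled Form")
    label = tk.Label(root, text="0")
    button = tk.Button(root, text="Click me")
    label.pack()
    button.pack()

    # The half you can't escape typing: the actual behavior is
    # still code, with syntax you have to know.
    count = 0
    def on_click():
        global count
        count += 1
        label.config(text=str(count))

    button.config(command=on_click)
    root.mainloop()

Every “visual” environment I’ve seen draws the first half for you and then drops you back into a text editor for the second, which is exactly the thick border I’m complaining about.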