Blog Archives
5.5.2020
Let’s make it, AR Cut and Paste

Implement the buzzed-about technology, try it out, and think up interesting ways to use it.
This time, I'd like to introduce
AR Cut and Paste
Whatever you capture with your smartphone camera gets pulled straight into Photoshop. Check out the video of it in action.
The original is here: https://twitter.com/cyrildiagne/status/1256916982764646402
I was so impressed by this magical interaction that I built it myself. How does it work? How do you make it? I'll explain for those who are curious. (It's still a prototype I'm tinkering with, so please note that you can't simply install an app and have it work with your own Photoshop right away.)
Let's build it right away! There are five steps:
1. Set up a remote connection for Photoshop
2. Build BASNet on an external server
3. Set up the local server
4. Set up the mobile app
5. Cut and paste
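Before diving in, the data flow across these steps can be sketched as plain functions. All of the names below are illustrative placeholders of my own, not the project's actual code:

```python
def remove_background(photo: bytes) -> bytes:
    """Steps 2-3: the local server sends the photo to BASNet for a cutout."""
    return photo  # placeholder: a real implementation calls the BASNet service


def locate_on_screen(camera_frame: bytes) -> tuple:
    """Step 3: screenpoint matches the camera frame against a screenshot."""
    return (0, 0)  # placeholder coordinates


def paste_into_photoshop(cutout: bytes, xy: tuple) -> str:
    """Step 1: the cutout is pasted via Photoshop's remote connection."""
    return f"pasted at {xy}"


# End-to-end: photograph an object, trim it, drop it where the phone points.
result = paste_into_photoshop(remove_background(b"photo"), locate_on_screen(b"frame"))
```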
1. Set up a remote connection for Photoshop
First, you need to set a password for your Photoshop remote connection.

In the preferences, check "Enable Remote Connections" and enter a password. The service name can be anything.
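Once the remote connection is enabled, a client can drive Photoshop by sending it ExtendScript over that connection. As a rough illustration (the function name and script body are my own assumptions, not the actual ar-cutpaste code), here is a sketch of building an ExtendScript string that places an image at given coordinates:

```python
# Hypothetical sketch: build the ExtendScript a client could send over
# Photoshop's remote connection. Illustrative only, not the project's code.

def build_paste_script(filename, x, y):
    """Return an ExtendScript string that opens `filename`, copies it into
    the active document, and moves the pasted layer to (x, y)."""
    return "\n".join([
        f'var placed = app.open(new File("{filename}"));',
        'placed.selection.selectAll();',
        'placed.selection.copy();',
        'placed.close(SaveOptions.DONOTSAVECHANGES);',
        'var layer = app.activeDocument.paste();',
        f'layer.translate({x}, {y});',
    ])


script = build_paste_script("/tmp/cutout.png", 120, 340)
```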

2. Build BASNet on an external server
BASNet is a boundary-aware salient object detection network.

https://github.com/NathanUA/BASNet (image from GitHub)
After deploying BASNet and its trained weights on an external server, we run the wrapper (https://github.com/cyrildiagne/basnet-http) that exposes it over HTTP. Since it uses the PyTorch neural network framework, it is recommended to run it on a machine with a CUDA-compatible Nvidia card and at least 6 GB of RAM.
If you're using AWS EC2, a P2 instance is a good choice: at about 100 yen an hour, it's good enough for testing.

Make a note of the instance's public IPv4 address; you'll need it later.
The repository also describes how to run it with Docker, but I didn't have permission to use Docker, so I built it from source. The web server is Apache.
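Conceptually, the local server uses this wrapper by POSTing a photo over HTTP and receiving the cutout back. The exact endpoint and field name below are assumptions for illustration, not the wrapper's documented API; the multipart encoding itself, though, is standard and can be built with the Python standard library:

```python
import uuid


def build_multipart(field, filename, data):
    """Encode `data` as a multipart/form-data request body (stdlib only).

    Returns (body, content_type). The field name used by the caller is an
    assumption for illustration, not the wrapper's documented API.
    """
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode() + data + f"\r\n--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"


# The body could then be POSTed to the wrapper, e.g. with urllib.request:
# req = urllib.request.Request("http://<ec2-ip>:8080", data=body,
#                              headers={"Content-Type": content_type})
```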

3. Set up the local server
Next, we configure the local server that talks to BASNet from the local machine. Here's the code: https://github.com/cyrildiagne/ar-cutpaste
It uses screenpoint, which projects the centroid of one image onto another image using OpenCV feature matching. Because of this, the document you create in Photoshop must be the specified size, or positioning will not work well.
server/src/ps.py
DOC_WIDTH = 2121
DOC_HEIGHT = 1280

def paste(filename, name, x, y, password='123456'):
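screenpoint works by matching features between the phone's camera frame and a screenshot of the monitor, estimating a homography, and projecting the image centroid through it. The projection step itself is just a matrix multiply in homogeneous coordinates; here is a NumPy-only sketch of that step (the actual library also does the OpenCV feature matching, which I omit):

```python
import numpy as np


def project_point(H, x, y):
    """Project (x, y) through a 3x3 homography H (homogeneous coordinates)."""
    px, py, w = H @ np.array([x, y, 1.0])
    return px / w, py / w


# With the identity homography the point maps to itself:
H = np.eye(3)
print(project_point(H, 1060.5, 640.0))  # centroid of a 2121x1280 document
# -> (1060.5, 640.0)
```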
The local server requires Python 3.6 or higher, and the default Python on the Mac is version 2, so use a virtual environment to avoid conflicts.
virtualenv -p python3.7 venv
source venv/bin/activate
pip install -r requirements.txt
Replace the IP with your EC2 instance's (don't forget port 8080), set your Photoshop password, and start it. The hostname can be omitted.
python src/main.py --basnet_service_ip="http://13.113.28.46:8080" --photoshop_password 123456
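The two flags in the command above map to ordinary command-line arguments. A minimal sketch of parsing them with argparse (the flag names follow the command above; the default password and help texts are my assumptions):

```python
import argparse


def parse_args(argv):
    """Parse the local server's flags (sketch; defaults are assumptions)."""
    p = argparse.ArgumentParser()
    p.add_argument("--basnet_service_ip", required=True,
                   help="URL of the BASNet HTTP wrapper, including port 8080")
    p.add_argument("--photoshop_password", default="123456",
                   help="Password set in Photoshop's remote connection settings")
    return p.parse_args(argv)


args = parse_args(["--basnet_service_ip", "http://13.113.28.46:8080",
                   "--photoshop_password", "123456"])
print(args.basnet_service_ip)  # -> http://13.113.28.46:8080
```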

4. Set up the mobile app
First, check your Mac's local IP: open System Preferences -> Network and note the IP address.

In the /app directory, run:
npm install
Then set your Mac's local IP address in the configuration file components/Server.tsx:
const URL = "http://192.168.2.106:8080";
Then start it:
npm start

When you scan the QR code with the Expo app, the mobile app launches. If you don't have it installed, you can find it by searching for "Expo" in the app store.
5. Cut and paste
Point your phone at an object and press and hold the screen to take a picture. Don't lift your finger yet.
After a moment the blinking square disappears; still holding your finger down, point your phone at the Photoshop screen.
When you're close to the screen, release your finger. After a short wait, the blinking square disappears again and the object you just photographed appears with its background trimmed away.
Capture lots of things and play around with them!

Summing up
That's it for the build. Thanks for your hard work!
It's still a prototype, so there's a fair amount of setup before it runs, but if it evolves to the point where it can be installed from an app store, it will make cutting out design elements much easier.
But there's another reason this technology seems interesting to me. AR is usually about bringing virtual objects into the real world; this is the opposite approach, turning real things into objects and carrying them into the virtual. The scanning accuracy is still rough, but what if it gets more accurate?
In today's terms, it's like taking lots of photos and uploading them to the cloud: you scan everything around you, turn it into objects, and upload it into a virtual space. In the past, photos could only be saved in two dimensions, like an album, but if you can scan 3D objects you can save them in three. That extra dimension enriches your avatar and your room, and you could walk around the space in VR. Doesn't that sound like fun?
Not long ago, there was a lot of buzz about the 2020 iPad Pro being equipped with LiDAR, which improves the accuracy of scanning real space. It's still rough, but it can scan in 3D. Once LiDAR comes to the iPhone, the market will expand further and there will be more opportunities to serve users who want to scan reality.
The AR Cut and Paste introduced here is a technology that lets us glimpse the dreamed-of future where AR and VR merge.