RUM doesn't work well with `UIControl`s that have nested views #1212
Comments
Hello @blindmonkey 👋 Thank you for the detailed report. What do you mean by "When we use a `UIControl` (which internally uses a `UITapGestureRecognizer`)"? Our instrumentation will capture all touches on the screen, so we have to apply filters to detect whether the view can actually receive taps. Our filtering is pretty strict indeed, and we don't consider tap-gestures — this is something that we should look at 👍 I will bring this to the team for consideration. In the meantime, I would suggest to report the tap manually:

```swift
stopUserAction(type: .tap, name: "My Custom Action", attributes: [:])
```
Hi @maxep, thanks for your response! Our use case is a bit complex: we have a generic `UIControl` subclass. Your proposed solution to manually report the tap is definitely something that we're considering, but I have a couple of questions on that front: Modifying the …
Hey @blindmonkey, sorry, I copied the wrong method. The correct one is:

```swift
DDGlobal.rum.addUserAction(type: .tap, name: "My Custom Action", attributes: [:])
```
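To illustrate, here is a sketch of how this manual reporting could look in an app. It assumes the dd-sdk-ios v1 Swift API (`Global.rum.addUserAction`, matching the method quoted above); `CardView` and the action name are hypothetical:

```swift
import UIKit
import Datadog  // dd-sdk-ios v1; exposes Global.rum

// Hypothetical custom view that handles taps via a UITapGestureRecognizer,
// which the automatic RUM instrumentation does not resolve to this view.
class CardView: UIView {
    func enableTapTracking() {
        let tap = UITapGestureRecognizer(target: self, action: #selector(didTap))
        addGestureRecognizer(tap)
    }

    @objc private func didTap() {
        // Report the tap manually, since the automatic instrumentation only
        // resolves UIControls and table-/collection-view cells.
        Global.rum.addUserAction(type: .tap, name: "Card Tap", attributes: [:])
    }
}
```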
Indeed, if the view that has a tap-gesture is itself a `UIControl`, it will be reported. Giving the ability to provide a custom `bestActionTarget(for:)` implementation sounds like a good improvement.
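For reference, the existing `UIKitRUMUserActionsPredicate` only lets you rename or drop an already-resolved target view — it cannot change *which* view is resolved, which is the limitation discussed in this thread. A sketch of a custom predicate, assuming the dd-sdk-ios v1 protocol (exact initializer details may differ):

```swift
import UIKit
import Datadog

// Sketch: drop cell-level actions (so manually reported control taps are
// not double-counted) and name everything else by its type or identifier.
struct CustomActionsPredicate: UIKitRUMUserActionsPredicate {
    func rumAction(targetView: UIView) -> RUMAction? {
        if targetView is UITableViewCell || targetView is UICollectionViewCell {
            return nil  // ignore cells; taps on them are reported manually
        }
        return RUMAction(
            name: targetView.accessibilityIdentifier
                ?? String(describing: type(of: targetView)),
            attributes: [:]
        )
    }
}
```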
Hi @maxep, thanks for clearing things up about `addUserAction`.

This does seem pretty straightforward. I'll check with my team to determine the direction we want to take and possibly make a PR to implement this.
This does sound great, and while we're on the topic it would be great to discuss support for any `UIGestureRecognizer`. But discussing generic support for gesture recognizers also makes me wonder whether it could be possible to provide generic support for `UIControl` events. To be clear, I'm not proposing this instead of the support for `UITapGestureRecognizer`.
I'm trying to enable RUM for our application, and running into some issues. Our application has lots of custom `UIControl` subclasses and uses `UITapGestureRecognizer` pretty heavily. However, RUM doesn't seem to do a great job of determining which view to track for the event in either of these cases.

The issue here seems to stem from the logic implemented here. In both of the cases described above, this method behaves incorrectly for our use case.

When we use a `UITapGestureRecognizer`, the reported view is not the view that the `UITapGestureRecognizer` is actually handling, but is instead a subview. In this case, the logic linked above will ignore the view that actually received the event (if it's not a `UIControl`) and find the nearest `superview` that's a subclass of `UITableViewCell` or `UICollectionViewCell`. This isn't ideal for arbitrary `UIView`s that have a `UITapGestureRecognizer` attached, but it's understandable.

When we use a `UIControl` (which internally uses a `UITapGestureRecognizer`) the situation is essentially the same -- a subview is attached to the `UITouch` event, that subview is (usually) not a `UIControl` subclass, and the code finds the nearest `superview` that's a subclass of `UITableViewCell` or `UICollectionViewCell`. However, I'd argue that in this case it should figure out to report our `UIControl`. Furthermore, the `sendActions` calls that our `UIControl` actually emits aren't tracked by RUM, and tracking those explicitly would create double tracking unless we ignore all cells in the `UIKitRUMUserActionsPredicate` and add code to track those manually as well.

It seems like a better approach would be to add a `|| parent is UIControl` condition when traversing the hierarchy, and report that as the `targetView`. Alternatively, it would be even better if we could override the default `bestActionTarget(for:)` logic with custom logic, perhaps as part of the `protocol UIKitRUMUserActionsPredicate`.
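The proposed `|| parent is UIControl` change amounts to one extra check in the superview traversal. The sketch below uses toy stand-in classes (`View`, `Control`, `TableViewCell` -- hypothetical, UIKit-free) so the logic is self-contained and runnable anywhere:

```swift
// Minimal, UIKit-free sketch of the proposed hierarchy traversal.
// View, Control, and TableViewCell stand in for UIView, UIControl,
// and UITableViewCell respectively.
class View {
    var superview: View?
    init(superview: View? = nil) { self.superview = superview }
}
class Control: View {}
class TableViewCell: View {}

// The proposed change: accept a Control as the action target,
// in addition to the existing cell check.
func bestActionTarget(for touched: View) -> View? {
    var current: View? = touched
    while let view = current {
        if view is Control || view is TableViewCell {
            return view
        }
        current = view.superview
    }
    return nil
}

// A cell containing a custom control containing a plain subview:
let cell = TableViewCell()
let control = Control(superview: cell)
let label = View(superview: control)

// The touch lands on the label, but the control is reported
// rather than the enclosing cell.
print(bestActionTarget(for: label) === control)  // prints "true"
```

With the `view is Control` check removed (today's behavior), the same touch would be attributed to `cell`, which is the mis-attribution described above.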